{"title":"Dynamic Conjugate Gradient Unfolding for Symbol Detection in Time-Varying Massive MIMO","authors":"Toluwaleke Olutayo;Benoit Champagne","doi":"10.1109/OJVT.2024.3410834","DOIUrl":null,"url":null,"abstract":"This article addresses the problem of symbol detection in time-varying Massive Multiple-Input Multiple-Output (M-MIMO) systems. While conventional detection techniques either exhibit subpar performance or impose excessive computational burdens in such systems, learning-based methods which have shown great potential in stationary scenarios, struggle to adapt to non-stationary conditions. To address these challenges, we introduce innovative extensions to the Learned Conjugate Gradient Network (LcgNet) M-MIMO detector. Firstly, we expound Preconditioned LcgNet (PrLcgNet), which incorporates a preconditioner during training to enhance the uplink M-MIMO detector's filter matrix. This modification enables the detector to achieve faster convergence with fewer layers compared to the original approach. Secondly, we introduce an adaptation of PrLcgNet referred to as Dynamic Conjugate Gradient Network (DyCoGNet), specifically designed for time-varying environments. DyCoGNet leverages self-supervised learning with Forward Error Correction (FEC), enabling autonomous adaptation without the need for explicit labeled data. It also employs meta-learning, facilitating rapid adaptation to unforeseen channel conditions. Our simulation results demonstrate that in stationary scenarios, PrLcgNet achieves faster convergence than LCgNet, which can be leveraged to reduce system complexity or improve Symbol Error Rate (SER) performance. Furthermore, in non-stationary scenarios, DyCoGNet exhibits rapid and efficient adaptation, achieving significant SER performance gains compared to baseline cases without meta-learning and a recent benchmark using self-supervised learning.","PeriodicalId":34270,"journal":{"name":"IEEE Open Journal of Vehicular Technology","volume":"5 ","pages":"792-806"},"PeriodicalIF":5.3000,"publicationDate":"2024-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10551475","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of Vehicular Technology","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10551475/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
This article addresses the problem of symbol detection in time-varying Massive Multiple-Input Multiple-Output (M-MIMO) systems. While conventional detection techniques either exhibit subpar performance or impose excessive computational burdens in such systems, learning-based methods, which have shown great potential in stationary scenarios, struggle to adapt to non-stationary conditions. To address these challenges, we introduce innovative extensions to the Learned Conjugate Gradient Network (LcgNet) M-MIMO detector. First, we present Preconditioned LcgNet (PrLcgNet), which incorporates a preconditioner during training to enhance the uplink M-MIMO detector's filter matrix. This modification enables the detector to achieve faster convergence with fewer layers than the original approach. Second, we introduce an adaptation of PrLcgNet, referred to as the Dynamic Conjugate Gradient Network (DyCoGNet), specifically designed for time-varying environments. DyCoGNet leverages self-supervised learning with Forward Error Correction (FEC), enabling autonomous adaptation without the need for explicit labeled data. It also employs meta-learning, facilitating rapid adaptation to unforeseen channel conditions. Our simulation results demonstrate that in stationary scenarios, PrLcgNet achieves faster convergence than LcgNet, which can be leveraged to reduce system complexity or improve Symbol Error Rate (SER) performance. Furthermore, in non-stationary scenarios, DyCoGNet exhibits rapid and efficient adaptation, achieving significant SER performance gains compared to baseline cases without meta-learning and a recent benchmark using self-supervised learning.
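To make the role of the preconditioner concrete, the sketch below shows a classical Jacobi-preconditioned conjugate gradient (PCG) solver applied to the linear MMSE detection system (H^H H + σ² I) x = H^H y. This is only an illustration of the underlying iteration that such detectors unfold: PrLcgNet, per the abstract, learns its preconditioner and step parameters as trainable network layers, whereas the fixed diagonal preconditioner, the function name, and all parameter values here are hypothetical stand-ins, not the paper's implementation.

```python
# Minimal sketch, assuming a Jacobi (diagonal) preconditioner as a stand-in for
# the learned preconditioner described in the abstract.
import numpy as np

def pcg_mmse_detect(H, y, noise_var, num_iters=8):
    """Solve (H^H H + noise_var * I) x = H^H y with Jacobi-preconditioned CG."""
    A = H.conj().T @ H + noise_var * np.eye(H.shape[1])
    b = H.conj().T @ y
    M_inv = 1.0 / np.real(np.diag(A))        # Jacobi preconditioner: inverse diagonal of A

    x = np.zeros_like(b)
    r = b - A @ x                             # residual
    z = M_inv * r                             # preconditioned residual
    p = z.copy()                              # search direction
    rz_old = np.vdot(r, z)

    for _ in range(num_iters):
        Ap = A @ p
        alpha = rz_old / np.vdot(p, Ap)       # step size along current direction
        x = x + alpha * p
        r = r - alpha * Ap
        z = M_inv * r
        rz_new = np.vdot(r, z)
        beta = rz_new / rz_old                # conjugacy-preserving update
        p = z + beta * p
        rz_old = rz_new
    return x                                  # soft symbol estimates

# Usage with random data as a stand-in for an uplink M-MIMO channel:
rng = np.random.default_rng(0)
N_rx, N_tx = 64, 8
H = (rng.standard_normal((N_rx, N_tx)) + 1j * rng.standard_normal((N_rx, N_tx))) / np.sqrt(2)
s = rng.choice([-1, 1], size=N_tx) + 1j * rng.choice([-1, 1], size=N_tx)   # QPSK symbols
y = H @ s + 0.1 * (rng.standard_normal(N_rx) + 1j * rng.standard_normal(N_rx))
x_hat = pcg_mmse_detect(H, y, noise_var=0.02)
```

A good preconditioner clusters the eigenvalues of the system matrix, which is what allows convergence in fewer iterations; in the unfolded-network setting reported in the abstract, fewer iterations translate directly into fewer layers.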