{"title":"Deep Learning Based Prediction of Signal-to-Noise Ratio (SNR) for LTE and 5G Systems","authors":"Thi-Phuong-Nhung Ngo, B. Kelley, P. Rad","doi":"10.1109/WINCOM50532.2020.9272470","DOIUrl":null,"url":null,"abstract":"Deep learning (DL) is applied to predict signal-to-noise ratio (SNR) in de facto LTE and 5G systems in a non-data-aided (NDA) manner. Various channel conditions and impairments are considered, including modulation types, path delays, and Doppler shifts. Both time-domain and frequency-domain signal grids are evaluated as inputs for SNR prediction. A combination of convolutional neural network (CNN) and long short term memory (LSTM) - CNN-LSTM - is used as the SNR predictor. Learning both spatial and temporal features is known to improve DL prediction accuracy. Techniques employed to enhance performance are SNR range/resolution manipulation, binary prediction, and multiple input prediction. Computer simulation is conducted using MATLAB LTE, 5G, and DL toolboxes to generate OFDM signals, model fading channels with AWGN noise, and construct CNN-LSTM. Simulation results show, with off-line training, DL based prediction of SNR in LTE and 5G systems has better accuracy and latency than traditional estimation techniques. Specifically, SNR prediction for SNR range of [-4, 32] dB and resolution of 2 dB utilizing time-domain signals has an accuracy of 100%, hence normalized mean square error (NMSE) of zero, and a latency of 1 millisecond or less.","PeriodicalId":283907,"journal":{"name":"2020 8th International Conference on Wireless Networks and Mobile Communications (WINCOM)","volume":"115 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 8th International Conference on Wireless Networks and Mobile Communications (WINCOM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WINCOM50532.2020.9272470","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
Deep learning (DL) is applied to predict the signal-to-noise ratio (SNR) in de facto LTE and 5G systems in a non-data-aided (NDA) manner. Various channel conditions and impairments are considered, including modulation types, path delays, and Doppler shifts. Both time-domain and frequency-domain signal grids are evaluated as inputs for SNR prediction. A combination of a convolutional neural network (CNN) and a long short-term memory (LSTM) network, termed CNN-LSTM, is used as the SNR predictor; learning both spatial and temporal features is known to improve DL prediction accuracy. Techniques employed to enhance performance include SNR range/resolution manipulation, binary prediction, and multiple-input prediction. Computer simulations are conducted using the MATLAB LTE, 5G, and DL toolboxes to generate OFDM signals, model fading channels with AWGN, and construct the CNN-LSTM. Simulation results show that, with off-line training, DL-based SNR prediction in LTE and 5G systems achieves better accuracy and lower latency than traditional estimation techniques. Specifically, SNR prediction over the [-4, 32] dB range at a resolution of 2 dB, using time-domain signals, attains an accuracy of 100%, hence a normalized mean square error (NMSE) of zero, with a latency of 1 millisecond or less.
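As a rough illustration of the approach the abstract describes, the sketch below (not the authors' code, which uses MATLAB's DL toolbox) casts NDA SNR prediction as classification with a CNN-LSTM in PyTorch. The input shape, layer sizes, and the treatment of one grid axis as the LSTM's time axis are assumptions for illustration; only the 19-bin label set follows from the stated [-4, 32] dB range at 2 dB resolution.

```python
# Minimal sketch, assuming: a time-domain OFDM signal grid stored as a
# 2-channel (I/Q) tensor, and SNR prediction cast as classification over
# the 19 bin centers {-4, -2, ..., 32} dB implied by the [-4, 32] dB
# range and 2 dB resolution. Layer sizes are illustrative, not from the paper.
import torch
import torch.nn as nn

SNR_BINS = torch.arange(-4, 34, 2)  # 19 candidate SNR values in dB

class CNNLSTM(nn.Module):
    def __init__(self, n_classes=len(SNR_BINS)):
        super().__init__()
        # CNN front end extracts spatial features from the I/Q signal grid.
        self.cnn = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # LSTM models temporal structure along one axis of the feature grid.
        self.lstm = nn.LSTM(input_size=32 * 16, hidden_size=64,
                            batch_first=True)
        self.head = nn.Linear(64, n_classes)  # one logit per SNR bin

    def forward(self, x):             # x: (batch, 2, 64, 64) I/Q grid
        f = self.cnn(x)               # (batch, 32, 16, 16)
        f = f.permute(0, 3, 1, 2)     # treat last grid axis as time steps
        f = f.flatten(2)              # (batch, 16, 32*16)
        out, _ = self.lstm(f)         # (batch, 16, 64)
        return self.head(out[:, -1])  # classify from the final time step

model = CNNLSTM()
logits = model(torch.randn(8, 2, 64, 64))  # dummy batch of signal grids
snr_hat = SNR_BINS[logits.argmax(dim=1)]   # predicted SNR per sample, in dB
```

Framing the task as classification over discrete bins is what makes the paper's "accuracy of 100%, hence NMSE of zero" reading possible: a correct bin label means zero error at the chosen 2 dB resolution.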