Deep Learning Aided Minimum Mean Square Error Estimation of Gaussian Source in Industrial Internet-of-Things Networks

Majumder Haider; Md. Zoheb Hassan; Imtiaz Ahmed; Jeffrey H. Reed; Ahmed Rubaai; Danda B. Rawat

IEEE Transactions on Industrial Cyber-Physical Systems, vol. 2, pp. 185-195, July 2024. DOI: 10.1109/TICPS.2024.3420823. Available at: https://ieeexplore.ieee.org/document/10579043/

Abstract: This article investigates the problem of estimating complex-valued Gaussian signals in an industrial Internet of Things (IIoT) environment, where the channel fading is temporally correlated and modeled by a finite state Markov process. To address the non-trivial problem of estimating channel fading states and signals simultaneously, we propose two deep learning (DL)-aided minimum mean square error (MMSE) estimation schemes. More specifically, our proposed framework consists of two steps: (i) a DL-aided channel fading state estimation and prediction step, followed by (ii) a linear MMSE estimation step to estimate the source signals for the learned channel fading states. Our proposed framework employs three DL models, namely the fully connected deep neural network (DNN), long short-term memory (LSTM) integrated DNN, and temporal convolution network (TCN). Extensive simulations show that these three DL models achieve similar accuracy in predicting the states of wireless fading channels. Our proposed data-driven approaches exhibit a reasonable performance gap in normalized mean square error (NMSE) compared to the genie-aided scheme, which considers perfect knowledge of instantaneous channel fading states.
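The abstract's second step, linear MMSE estimation of a complex Gaussian source for a known fading state, can be illustrated with a generic sketch. This is not the paper's code: the source variance, noise variance, and fixed channel coefficient `h` below are arbitrary illustrative values, and perfect knowledge of `h` corresponds to the genie-aided baseline the abstract compares against.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000
sigma_x2 = 1.0   # source (signal) variance, assumed
sigma_n2 = 0.1   # additive noise variance, assumed
h = 0.8 + 0.3j   # one fading state, assumed perfectly known (genie-aided case)

# Complex Gaussian source and noise, observation y = h*x + n
x = np.sqrt(sigma_x2 / 2) * (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples))
n = np.sqrt(sigma_n2 / 2) * (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples))
y = h * x + n

# Linear MMSE estimator x_hat = w * y, with the standard Wiener weight
w = sigma_x2 * np.conj(h) / (np.abs(h) ** 2 * sigma_x2 + sigma_n2)
x_hat = w * y

# Empirical NMSE vs. the closed-form MMSE for this scalar Gaussian model
nmse = np.mean(np.abs(x - x_hat) ** 2) / sigma_x2
nmse_theory = sigma_n2 / (np.abs(h) ** 2 * sigma_x2 + sigma_n2)
```

In the paper's setting the fading state is not known and must first be predicted by one of the DL models; the resulting NMSE gap relative to this genie-aided estimator is the metric reported in the simulations.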