LSTM-CNN for Behavioral Modeling and Predistortion of 5G Power Amplifiers

Wen Wang, Lu Sun, Haoming Liu, Yibo Feng
{"title":"LSTM-CNN for Behavioral Modeling and Predistortion of 5G Power Amplifiers","authors":"Wen Wang, Lu Sun, Haoming Liu, Yibo Feng","doi":"10.1109/MAPE53743.2022.9935205","DOIUrl":null,"url":null,"abstract":"In the Fifth Generation (5G) communication system, power amplifiers (PAs) have serious nonlinear distortion and memory effect. For this reason, this paper proposes a neural network model to linearize PAs. Common PA models such as the long short-term memory (LSTM) network and the deep neural network (DNN) have high complexity problems. Therefore, this paper proposes a behavioral model consisting of LSTM and one dimensional convolutional neural network (1D-CNN), namely LSTM-CNN, for PAs. The LSTM layer is proposed to extract time series information of the input signal to simulate the memory effect of the PA and 1D-CNN structure is used to model the nonlinear characteristics of the PA and reduce model complexity. In addition, the predistortion structure of the PA inverse model based on iteration is used to implement linearization. Finally, modeling results of the class F-PA with the proposed LSTM-CNN show that normalized mean square error (NMSE) can reach about −45 dB. Digital predistortion (DPD) results show that the adjacent channel power ratio (ACPR) can be improved by 14 dB.","PeriodicalId":442568,"journal":{"name":"2022 IEEE 9th International Symposium on Microwave, Antenna, Propagation and EMC Technologies for Wireless Communications (MAPE)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 9th International Symposium on Microwave, Antenna, Propagation and EMC Technologies for Wireless Communications (MAPE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MAPE53743.2022.9935205","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In the Fifth Generation (5G) communication system, power amplifiers (PAs) exhibit serious nonlinear distortion and memory effects. For this reason, this paper proposes a neural network model to linearize PAs. Common PA models such as the long short-term memory (LSTM) network and the deep neural network (DNN) suffer from high complexity. Therefore, this paper proposes a behavioral model for PAs consisting of an LSTM and a one-dimensional convolutional neural network (1D-CNN), namely LSTM-CNN. The LSTM layer extracts time-series information from the input signal to capture the memory effect of the PA, while the 1D-CNN structure models the nonlinear characteristics of the PA and reduces model complexity. In addition, an iteration-based predistortion structure built on the PA inverse model is used to implement linearization. Finally, modeling results for a class-F PA with the proposed LSTM-CNN show that the normalized mean square error (NMSE) reaches about −45 dB. Digital predistortion (DPD) results show that the adjacent channel power ratio (ACPR) is improved by 14 dB.
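To make the described architecture concrete, the following is a minimal sketch of an LSTM followed by a 1D-CNN for PA behavioral modeling, assuming a PyTorch implementation operating on I/Q (two-channel) baseband samples. The layer sizes, activation, and exact layer ordering are illustrative assumptions, not the configuration reported in the paper.

```python
# Illustrative LSTM-CNN PA behavioral model (assumed PyTorch sketch).
# Hyperparameters and layer details are assumptions for illustration only.
import torch
import torch.nn as nn

class LSTMCNNBehavioralModel(nn.Module):
    def __init__(self, hidden_size=16, cnn_channels=8, kernel_size=3):
        super().__init__()
        # LSTM captures the PA memory effect from the I/Q time series.
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size,
                            batch_first=True)
        # 1D convolution models the nonlinear characteristics over the LSTM features.
        self.conv = nn.Conv1d(in_channels=hidden_size,
                              out_channels=cnn_channels,
                              kernel_size=kernel_size,
                              padding=kernel_size // 2)
        self.act = nn.Tanh()
        # Linear read-out maps the CNN features back to output I/Q samples.
        self.fc = nn.Linear(cnn_channels, 2)

    def forward(self, x):
        # x: (batch, time, 2) -- in-phase and quadrature components
        h, _ = self.lstm(x)          # (batch, time, hidden)
        h = h.transpose(1, 2)        # (batch, hidden, time) for Conv1d
        h = self.act(self.conv(h))   # (batch, cnn_channels, time)
        h = h.transpose(1, 2)        # (batch, time, cnn_channels)
        return self.fc(h)            # predicted PA output I/Q

# Example: one batch of 4 sequences, 128 samples each
model = LSTMCNNBehavioralModel()
y_hat = model(torch.randn(4, 128, 2))
print(y_hat.shape)  # torch.Size([4, 128, 2])
```

For reference, the NMSE figure quoted in the abstract is conventionally computed as 10·log10( Σ|y − ŷ|² / Σ|y|² ) over the measured PA output y and the model prediction ŷ.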