ReCPos: Deep Learning Network for 5G NR High-precision Positioning

Fangyi Yu, Long Zhao, Xinfang Chen, Hongrui Shen
2023 IEEE/CIC International Conference on Communications in China (ICCC Workshops), published 2023-08-10.
DOI: 10.1109/ICCCWorkshops57813.2023.10233803

Abstract

With the increasing demand for location-based services, various positioning technologies have been proposed. However, existing positioning technologies are often limited to line-of-sight (LOS) scenarios and cannot provide accurate results in non-line-of-sight (NLOS) scenarios. We therefore propose ReCPos, a deep residual convolutional neural network for high-precision positioning in heavy NLOS scenarios, in which four residual modules with distinct structures are designed. After preprocessing the channel impulse response (CIR) and the reference signal received power (RSRP), the residual modules extract a high-dimensional feature vector from which ReCPos predicts the location. Experimental results indicate that ReCPos reduces the positioning error by at least 20.0% compared with existing AI-based positioning schemes under heavy NLOS conditions, while requiring lower model complexity and less computing power. Using the CIR combined with the RSRP, with the CIR truncated, yields higher positioning accuracy than using the CIR alone.
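The paper does not publish its preprocessing code, but the abstract's pipeline (truncate the CIR, append the RSRP, feed the result through residual mappings) can be sketched as follows. This is a minimal illustration under assumed conventions: the truncation window, the normalization, and the dense residual mapping `residual_block` are hypothetical stand-ins, not the authors' actual design (ReCPos uses convolutional residual modules).

```python
import numpy as np

def preprocess(cir, rsrp, n_taps=32):
    """Hypothetical preprocessing: keep the n_taps CIR taps starting at the
    strongest path (truncation), normalize amplitudes, then append the RSRP
    as one extra scalar feature."""
    cir = np.abs(np.asarray(cir, dtype=float))
    peak = int(np.argmax(cir))                  # index of the strongest path
    window = cir[peak:peak + n_taps]            # truncated CIR window
    if window.size < n_taps:                    # zero-pad short windows
        window = np.pad(window, (0, n_taps - window.size))
    window = window / (window.max() + 1e-12)    # amplitude normalization
    return np.concatenate([window, [rsrp]])     # CIR features + RSRP

def residual_block(x, w1, w2):
    """Residual mapping y = x + F(x), the skip-connection idea behind the
    paper's residual modules, shown here with two ReLU dense layers."""
    h = np.maximum(w1 @ x, 0.0)
    return x + np.maximum(w2 @ h, 0.0)
```

The skip connection lets each module learn only a correction on top of its input, which is what makes deep stacks of such modules trainable; the actual feature extractor in ReCPos stacks four convolutional residual modules with distinct structures before the location regressor.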