A method for elevated ducts refinement based on convolutional neural network

IF 1.6 | CAS Tier 4 (Earth Science) | JCR Q3 (Astronomy & Astrophysics)
Radio Science, Vol. 59, No. 6, pp. 1-22 | Pub Date: 2024-06-01 | DOI: 10.1029/2023RS007789
Xunyang Zhu; Ke Yan; Liquan Jiang; Ling Tian; Bin Tian
{"title":"A method for elevated ducts refinement based on convolutional neural network","authors":"Xunyang Zhu;Ke Yan;Liquan Jiang;Ling Tian;Bin Tian","doi":"10.1029/2023RS007789","DOIUrl":null,"url":null,"abstract":"Elevated duct (EleD) is an abnormal atmospheric refraction structure with a suspended trapped layer. The precise and highly resolved elevated duct-height-based data (EleDH) is crucial for radio communication systems, especially in electromagnetic wave path loss prediction and EleDH-producing systems. However, producing high-resolution EleDH is challenging because of the massive details in the EleDH data. Direct and high-time refinement procedures mostly lead to unrealistic outcomes. The study provides a Dense-Linear convolutional neural network (DLCNN)-based EleDH refinement technique based on the development of statistical downscaling and super-resolution technologies. Additionally, the stack approach is used, and the refining order is taken into consideration to ensure precision in high-time refinement and provide reliable outcomes. To demonstrate the strength of DLCNN in capturing complex internal characteristics of EleDH, a new EleD data set is first funded, which only contains the duct height. From this data set, we use the duct height as the core refinement of the EleD's trapped layer and the thickness of the trapped layer to ensure reliable duct height. Seven super-resolution models are utilized for fair comparisons. The experimental results prove that the DLCNN has the highest refinement performance; also, it obtained excellent generalization capacity, where the minimum and maximum obtained Accuracy(20%), MAE, and RMSE were 85.22% ∓ 88.30%, 36.09 ∓ 45.97 and 8.68 ∓ 10.14, respectively. High-resolution EleDH improves path loss prediction, where the minimum and maximum obtained bias were 2.37 ∓ 9.51 dB.","PeriodicalId":49638,"journal":{"name":"Radio Science","volume":"59 6","pages":"1-22"},"PeriodicalIF":1.6000,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Radio Science","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10579707/","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

An elevated duct (EleD) is an anomalous atmospheric refraction structure with a suspended trapping layer. Precise, high-resolution elevated-duct-height data (EleDH) are crucial for radio communication systems, especially for electromagnetic-wave path-loss prediction and for systems that produce EleDH. However, producing high-resolution EleDH is challenging because of the fine-scale detail in the EleDH field: direct, large-factor refinement procedures mostly yield unrealistic results. This study presents a Dense-Linear convolutional neural network (DLCNN) EleDH refinement technique that builds on developments in statistical downscaling and super-resolution. In addition, a stacked approach is used and the refinement order is taken into account, so that large-factor refinement remains precise and the outcomes stay reliable. To demonstrate the strength of the DLCNN in capturing the complex internal structure of EleDH, a new EleD data set containing only the duct height is first constructed. From this data set, the duct height is taken as the core refinement target for the EleD trapping layer, with the trapping-layer thickness used to keep the refined duct heights reliable. Seven super-resolution models are used for fair comparison. The experimental results show that the DLCNN achieves the best refinement performance and excellent generalization capacity: the minimum and maximum values of Accuracy(20%), MAE, and RMSE were 85.22% to 88.30%, 36.09 to 45.97, and 8.68 to 10.14, respectively. The high-resolution EleDH also improves path-loss prediction, with minimum and maximum biases of 2.37 to 9.51 dB.
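The abstract does not reproduce the DLCNN architecture, so the following PyTorch sketch is only an illustration of the general idea it describes: a dense convolutional feature path combined with a linear (here, bicubic) upsampling path to refine a coarse duct-height grid. The class names `DenseBlock` and `DLCNNSketch`, the growth rate, and the 4x scale factor are all assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a dense + linear CNN that
# refines a coarse duct-height grid, assuming a 4x upscaling factor.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseBlock(nn.Module):
    """Convolutions whose inputs are the concatenation of all earlier feature maps."""
    def __init__(self, channels: int, growth: int = 16, layers: int = 4):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(channels + i * growth, growth, kernel_size=3, padding=1)
            for i in range(layers)
        )
        self.fuse = nn.Conv2d(channels + layers * growth, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(F.relu(conv(torch.cat(feats, dim=1))))
        return self.fuse(torch.cat(feats, dim=1))

class DLCNNSketch(nn.Module):
    """Dense feature path plus linear (bicubic) skip path, summed at full resolution."""
    def __init__(self, scale: int = 4, channels: int = 32):
        super().__init__()
        self.scale = scale
        self.head = nn.Conv2d(1, channels, kernel_size=3, padding=1)
        self.body = nn.Sequential(DenseBlock(channels), DenseBlock(channels))
        self.tail = nn.Conv2d(channels, scale * scale, kernel_size=3, padding=1)

    def forward(self, coarse):            # coarse: (B, 1, H, W) duct height grid
        linear = F.interpolate(coarse, scale_factor=self.scale,
                               mode="bicubic", align_corners=False)
        residual = F.pixel_shuffle(self.tail(self.body(self.head(coarse))),
                                   self.scale)
        return linear + residual          # (B, 1, scale*H, scale*W)

# Example: refine a 32x32 coarse grid to 128x128.
model = DLCNNSketch()
fine = model(torch.randn(1, 1, 32, 32))
print(fine.shape)  # torch.Size([1, 1, 128, 128])
```

The linear skip path guarantees that the network only has to learn the residual detail on top of a smooth interpolation, which is a common way to stabilize large-factor refinement.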
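The abstract's error measures can be reproduced mechanically. Below is a hedged sketch of how Accuracy(20%), MAE, and RMSE might be computed over a refined field; the paper's exact definition of Accuracy(20%) is not given here, so reading it as the fraction of grid points whose relative error is within 20% is an assumption.

```python
# Hedged sketch of the abstract's metrics; Accuracy(20%) is assumed to mean
# the share of points whose relative error is within 20% of the truth.
import numpy as np

def refinement_metrics(pred: np.ndarray, truth: np.ndarray, tol: float = 0.20):
    err = pred - truth
    mae = np.mean(np.abs(err))                            # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))                     # root-mean-square error
    rel = np.abs(err) / np.maximum(np.abs(truth), 1e-6)   # guard divide-by-zero
    acc = np.mean(rel <= tol)                             # Accuracy(20%)
    return {"Accuracy(20%)": acc, "MAE": mae, "RMSE": rmse}

# Example on synthetic duct heights (meters):
truth = np.random.uniform(500.0, 3000.0, size=(64, 64))
pred = truth + np.random.normal(0.0, 50.0, size=truth.shape)
print(refinement_metrics(pred, truth))
```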
Source Journal

Radio Science (Engineering & Technology: Geochemistry & Geophysics)
CiteScore: 3.30
Self-citation rate: 12.50%
Articles per year: 112
Review time: 1 month
Journal description: Radio Science (RDS) publishes original scientific contributions on radio-frequency electromagnetic propagation and its applications. Contributions covering measurement, modelling, prediction, and forecasting techniques pertinent to fields and waves, including antennas, signals and systems, the terrestrial and space environment, and radio propagation problems in radio astronomy, are welcome. Contributions may address propagation through, interaction with, and remote sensing of structures, geophysical media, plasmas, and materials, as well as the application of radio-frequency electromagnetic techniques to remote sensing of the Earth and other bodies in the solar system.