An Efficient Ray-Tracing Framework for Urban Scenarios Powered by Heterogeneous Graph Neural Networks

Impact Factor 4.8 · CAS Zone 2 (Computer Science) · JCR Q2, Engineering, Electrical & Electronic
Shiyuan Liu;Yujie Feng;Songjiang Yang;Bo Ma;Chuanhuang Li
DOI: 10.1109/LAWP.2025.3576376
Journal: IEEE Antennas and Wireless Propagation Letters, vol. 24, no. 9, pp. 2884-2888
Publication date: 2025-06-04
Citations: 0

Abstract

In this letter, we propose an accurate and efficient heterogeneous graph neural network-powered ray-tracing (HGNN-RT) framework to predict path loss in urban scenarios. The proposed HGNN-RT captures both global and local features: broad path attributes, such as total path length and line-of-sight conditions, as well as specific interactions, such as reflection angles and obstacles. Moreover, owing to the fast information flow of graph neural networks, HGNN-RT improves the efficiency of path loss prediction in complex multireflection and multipath propagation channels. Experimental results demonstrate that the proposed approach achieves errors within 2 dB of ray-tracing benchmarks, while remaining time-efficient and generalizing reliably to unseen environments for channel modeling.
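The heterogeneous-graph idea summarized in the abstract (a global, path-level feature set fused with local, per-interaction features via message passing, then mapped to a scalar path loss) can be sketched as follows. This is an illustration only, not the authors' model: the letter does not publish code, and every class name, feature choice, and weight below is an assumption made for the sketch.

```python
# Toy heterogeneous message-passing sketch (assumed structure, not the
# published HGNN-RT implementation): "interaction" nodes carry local
# features (e.g. reflection angle, obstacle flag) and send messages to a
# "path" node carrying global features (e.g. total length, LOS flag);
# a linear head maps the fused embedding to a path-loss estimate in dB.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class HeteroPathLossSketch:
    def __init__(self, d_local=2, d_global=2, d_hidden=8):
        # Untrained random weights: the sketch shows data flow, not accuracy.
        self.W_local = rng.normal(0.0, 0.5, (d_local, d_hidden))
        self.W_global = rng.normal(0.0, 0.5, (d_global, d_hidden))
        self.w_out = rng.normal(0.0, 0.5, (2 * d_hidden,))

    def forward(self, global_feat, local_feats):
        # Embed each interaction node, then mean-aggregate: permutation
        # invariant, so the prediction is independent of interaction order.
        msgs = relu(local_feats @ self.W_local)   # (n_interactions, d_hidden)
        agg = msgs.mean(axis=0)                   # (d_hidden,)
        g = relu(global_feat @ self.W_global)     # (d_hidden,)
        fused = np.concatenate([g, agg])          # (2 * d_hidden,)
        return float(fused @ self.w_out)          # scalar path loss (dB)

# One candidate ray path: global = [normalized total length, LOS flag];
# two reflections, each [reflection angle (rad), obstacle flag].
model = HeteroPathLossSketch()
pl = model.forward(np.array([0.8, 0.0]),
                   np.array([[0.6, 1.0], [1.2, 0.0]]))
print(round(pl, 3))
```

Separating the two node types lets the network treat path-level context and per-bounce physics with different learned transforms, which is the structural point the abstract makes about "global and local features".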
Source Journal Metrics

CiteScore: 8.00
Self-citation rate: 9.50%
Articles published per year: 529
Average review time: 1.0 months
Journal description: IEEE Antennas and Wireless Propagation Letters (AWP Letters) is devoted to the rapid electronic publication of short manuscripts in the technical areas of antennas and wireless propagation. These are areas of competence for the IEEE Antennas and Propagation Society (AP-S). AWPL aims to be one of the fastest journals among IEEE publications: for papers that are eventually accepted, an author may expect the paper to appear in IEEE Xplore, on average, around two months after submission.