{"title":"High-Resolution Augmented Multimodal Sensing of Distributed Radar Network","authors":"Anum Pirkani;Dillon Kumar;Edward Hoare;Muge Bekar;Natalie Reeves;Mikhail Cherniakov;Marina Gashinova","doi":"10.1109/TRS.2025.3581396","DOIUrl":null,"url":null,"abstract":"Advancement toward fully autonomous systems requires enhanced sensing and perception, particularly a 360° vision for safe maneuvering. One approach to achieving this is through a distributed network of radar sensors, operating in homogeneous or heterogeneous configurations, strategically positioned to provide increased coverage and visibility in otherwise blind regions. Such a multiperspective sensing network, complemented with multimodal signal processing, can significantly improve the angular resolution of the radar, delivering high-fidelity scene imagery essential for region classification and path planning. This study presents a methodology for multimodal and multiperspective sensing using heterogeneous radar sensors, utilizing Doppler beam sharpening (DBS) within multiple-input-multiple-output (MIMO) radars to enhance the resolution and coverage. Traditional frequency-modulated continuous wave (FMCW)–MIMO radars, currently the most widely used configuration, are prone to Doppler aliasing, limiting the field of view (FoV) in DBS and MIMO–DBS processing. To address this limitation, the effective FoV in multiperspective image is extended to that provided by the radar’s physical aperture. The proposed framework is validated using 77-GHz radar chipsets in both automotive and maritime conditions, with sensors mounted in front-looking, corner-looking, and side-looking orientations.","PeriodicalId":100645,"journal":{"name":"IEEE Transactions on Radar Systems","volume":"3 ","pages":"905-918"},"PeriodicalIF":0.0000,"publicationDate":"2025-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Radar Systems","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11044409/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Advancement toward fully autonomous systems requires enhanced sensing and perception, particularly 360° vision for safe maneuvering. One approach to achieving this is a distributed network of radar sensors, operating in homogeneous or heterogeneous configurations and strategically positioned to provide increased coverage and visibility in otherwise blind regions. Such a multiperspective sensing network, complemented with multimodal signal processing, can significantly improve the angular resolution of the radar, delivering the high-fidelity scene imagery essential for region classification and path planning. This study presents a methodology for multimodal and multiperspective sensing using heterogeneous radar sensors, applying Doppler beam sharpening (DBS) within multiple-input-multiple-output (MIMO) radars to enhance resolution and coverage. Traditional frequency-modulated continuous wave (FMCW)–MIMO radars, currently the most widely used configuration, are prone to Doppler aliasing, which limits the field of view (FoV) in DBS and MIMO–DBS processing. To address this limitation, the effective FoV of the multiperspective image is extended to that provided by the radar's physical aperture. The proposed framework is validated using 77-GHz radar chipsets under both automotive and maritime conditions, with sensors mounted in front-looking, corner-looking, and side-looking orientations.
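
As context for the DBS and Doppler-aliasing points in the abstract, the sketch below works through the standard first-order relations: the Doppler of a static scatterer seen from a moving platform, the angular cell width that a Doppler cell of width 1/T_CPI implies, and the angular region over which an FMCW chirp sequence remains free of Doppler aliasing. This is an illustrative approximation only; none of the parameter values (ego speed, chirp interval, CPI length, virtual array size) or helper functions are taken from the paper, and only the 77-GHz band matches the abstract.

```python
# Illustrative sketch only: first-order relations behind Doppler beam sharpening
# (DBS) and the Doppler-aliasing limit on its usable FoV for an FMCW-MIMO radar
# on a moving platform. Parameter values are assumptions for illustration and
# are NOT taken from the paper (only the 77-GHz band matches the abstract).

import numpy as np

C = 3e8                  # speed of light, m/s
FC = 77e9                # carrier frequency, Hz
LAM = C / FC             # wavelength, m (~3.9 mm)

# Assumed platform / waveform parameters (hypothetical values)
V_EGO = 20.0             # platform speed, m/s
T_CHIRP = 120e-6         # chirp repetition interval, s
T_CPI = 20e-3            # coherent processing interval, s
N_VIRT = 8               # number of virtual MIMO array elements
D_EL = LAM / 2.0         # virtual element spacing, m


def dbs_angular_resolution(theta_deg, v=V_EGO, t_cpi=T_CPI, lam=LAM):
    """Approximate DBS angular cell width (rad) at angle theta from the velocity vector.

    A static scatterer has Doppler f_d = (2 v / lam) * cos(theta); a Doppler cell
    of width 1 / t_cpi therefore maps to an angular cell of width
    lam / (2 v t_cpi sin(theta)). The cell diverges as theta -> 0, i.e. DBS
    cannot sharpen along the velocity vector.
    """
    theta = np.radians(theta_deg)
    return lam / (2.0 * v * t_cpi * np.sin(theta))


def mimo_beamwidth(n=N_VIRT, d=D_EL, lam=LAM):
    """Approximate broadside beamwidth (rad) of the virtual MIMO aperture n * d."""
    return lam / (n * d)


def aliasing_free_angle(v=V_EGO, t_chirp=T_CHIRP, lam=LAM):
    """Smallest angle from the velocity vector (deg) at which DBS is alias-free.

    Doppler aliases once |f_d| exceeds 1 / (2 t_chirp); solving
    (2 v / lam) * cos(theta) = 1 / (2 t_chirp) gives the boundary.
    """
    arg = lam / (4.0 * v * t_chirp)
    if arg >= 1.0:
        return 0.0           # ego Doppler never exceeds the limit: full FoV usable
    return np.degrees(np.arccos(arg))


if __name__ == "__main__":
    print(f"virtual-array beamwidth : {np.degrees(mimo_beamwidth()):6.2f} deg")
    for ang in (30.0, 60.0, 90.0):
        res = np.degrees(dbs_angular_resolution(ang))
        print(f"DBS cell at {ang:3.0f} deg     : {res:6.2f} deg")
    print(f"alias-free beyond       : {aliasing_free_angle():6.1f} deg off the velocity vector")
```

With these assumed numbers the DBS cell away from the velocity vector is far finer than the virtual-array beamwidth, which is the resolution gain the abstract refers to, while the aliasing boundary shows how the usable DBS FoV shrinks toward the velocity vector as the chirp interval or platform speed grows, motivating the FoV extension described in the paper.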