{"title":"ProFus: Progressive Radar–Vision Heterogeneous Modality Fusion for Maritime Target Detection","authors":"Jingang Wang;Shikai Wu;Peng Liu","doi":"10.1109/LGRS.2025.3601131","DOIUrl":null,"url":null,"abstract":"Maritime monitoring is crucial in both civilian and military applications, with shore-based radar and visual systems widely used due to their cost effectiveness. However, single-sensor methods have notable limitations: radar systems, while offering wide detection coverage, suffer from high false alarm rates and lack detailed target information, whereas visual systems provide rich details but perform poorly in adverse weather conditions such as rain and fog. To address these issues, this letter proposes a progressive radar–vision fusion method for surface target detection. Due to the significant differences in data characteristics between radar and visual sensors, direct fusion is nearly infeasible. Instead, the proposed method adopts a stepwise fusion strategy, consisting of coordinate calibration, shallow feature fusion, and deep feature integration. Experimental results show that this approach achieves an <inline-formula> <tex-math>$\\text {mAP}_{50}$ </tex-math></inline-formula> of 86.7% and an <inline-formula> <tex-math>$\\text {mAP}_{75}$ </tex-math></inline-formula> of 54.5%, outperforming YOLOv10 by 1.0% and 1.5%, respectively. Moreover, the proposed method significantly surpasses existing state-of-the-art radar–vision fusion approaches, demonstrating its superior effectiveness in complex environments.","PeriodicalId":91017,"journal":{"name":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","volume":"22 ","pages":"1-5"},"PeriodicalIF":4.4000,"publicationDate":"2025-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11133591/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Maritime monitoring is crucial in both civilian and military applications, with shore-based radar and visual systems widely used due to their cost-effectiveness. However, single-sensor methods have notable limitations: radar systems, while offering wide detection coverage, suffer from high false alarm rates and lack detailed target information, whereas visual systems provide rich details but perform poorly in adverse weather conditions such as rain and fog. To address these issues, this letter proposes a progressive radar–vision fusion method for surface target detection. Because radar and visual sensors differ significantly in data characteristics, direct fusion is nearly infeasible. Instead, the proposed method adopts a stepwise fusion strategy consisting of coordinate calibration, shallow feature fusion, and deep feature integration. Experimental results show that this approach achieves an $\text{mAP}_{50}$ of 86.7% and an $\text{mAP}_{75}$ of 54.5%, outperforming YOLOv10 by 1.0% and 1.5%, respectively. Moreover, the proposed method significantly surpasses existing state-of-the-art radar–vision fusion approaches, demonstrating its superior effectiveness in complex environments.
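The abstract only names the three fusion stages, so the sketch below is a minimal, illustrative PyTorch rendering of what such a progressive pipeline could look like. Every identifier here (project_radar_to_image, ShallowFusion, DeepFusion), the use of a homography for coordinate calibration, and the gated-residual fusion operator are assumptions made for illustration; they are not the architecture described in the letter.

```python
# Hypothetical sketch of a three-stage progressive radar-vision fusion pipeline.
# All module names, tensor shapes, and fusion operators are illustrative assumptions.
import torch
import torch.nn as nn


def project_radar_to_image(radar_points: torch.Tensor, homography: torch.Tensor) -> torch.Tensor:
    """Stage 1 - coordinate calibration: map planar radar detections (x, y) into the
    camera image plane with an assumed 3x3 homography."""
    n = radar_points.shape[0]
    homogeneous = torch.cat([radar_points, torch.ones(n, 1)], dim=1)  # (N, 3)
    projected = (homography @ homogeneous.T).T                        # (N, 3)
    return projected[:, :2] / projected[:, 2:3]                       # (N, 2) pixel coords


class ShallowFusion(nn.Module):
    """Stage 2 - shallow feature fusion: concatenate a rasterized radar map with the
    RGB image and mix them with a lightweight convolution (an assumed design)."""
    def __init__(self, out_channels: int = 16):
        super().__init__()
        self.mix = nn.Conv2d(3 + 1, out_channels, kernel_size=3, padding=1)

    def forward(self, image: torch.Tensor, radar_map: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.mix(torch.cat([image, radar_map], dim=1)))


class DeepFusion(nn.Module):
    """Stage 3 - deep feature integration: refine the shallow fused features and
    re-inject them through a gated residual (again, an illustrative choice)."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU())
        self.gate = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                  nn.Conv2d(channels, channels, 1), nn.Sigmoid())

    def forward(self, shallow_feat: torch.Tensor) -> torch.Tensor:
        deep = self.backbone(shallow_feat)
        return deep * self.gate(deep) + shallow_feat  # gated residual fusion


if __name__ == "__main__":
    # Stage 1: project two dummy radar detections with a placeholder identity homography.
    radar_xy = torch.tensor([[12.0, 3.5], [40.0, -7.2]])
    pixels = project_radar_to_image(radar_xy, torch.eye(3))

    # Stages 2-3: fuse a dummy camera frame with a dummy rasterized radar map.
    image = torch.rand(1, 3, 128, 128)
    radar_map = torch.rand(1, 1, 128, 128)
    fused = DeepFusion()(ShallowFusion()(image, radar_map))
    print(pixels.shape, fused.shape)  # torch.Size([2, 2]) torch.Size([1, 16, 128, 128])
```

The gated residual in the final stage is one plausible way to let visual features dominate when the radar channel is noisy, in the spirit of the letter's claim that deep integration suppresses radar false alarms; the actual mechanism used by ProFus may differ.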