SMAF-net: semantics-guided modality transfer and hierarchical feature fusion for optical-SAR image registration

Yumeng Hong, Jun Pan, Jiangong Xu, Shuying Jin, Junli Li

International Journal of Applied Earth Observation and Geoinformation, vol. 143, Article 104827, September 2025. DOI: 10.1016/j.jag.2025.104827
Accurate registration of optical and synthetic aperture radar (SAR) images is critical for effective fusion in remote sensing applications. To address the significant radiometric and geometric differences between these modalities, SMAF-Net, a novel network that integrates semantics-guided modality transfer and hierarchical feature fusion for optical-SAR image registration, is proposed. For modality transfer, a feature-constrained generative adversarial module (SGMT) translates SAR images into pseudo-optical images. Because deep features from a multiscale feature learning module (MFLM) are incorporated as semantic constraints, the translated images preserve structural details and modality discrepancies are reduced. For feature matching, a channel attention-based hierarchical aggregation module (CA-HAM) is designed to effectively fuse multi-level features. Combined with a joint detection-description strategy, the network enables accurate keypoint detection and descriptor extraction. Experiments on optical-SAR datasets show that the proposed method achieves an average registration error of 2.26 pixels, outperforming state-of-the-art (SOTA) methods and enabling accurate registration between optical and SAR images.
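The abstract does not spell out the SGMT objective, so the following is only a minimal sketch of how a feature-constrained generator loss of this kind is commonly built: an LSGAN-style adversarial term plus an L1 constraint that pulls the deep features of the pseudo-optical image toward those of the reference optical image. The feature extractor here merely stands in for the paper's MFLM, and the names `generator_loss` and `lambda_sem` (and the weight 10.0) are hypothetical, not the authors' configuration.

```python
import torch
import torch.nn.functional as F

def generator_loss(discriminator, feature_extractor, optical, fake_optical,
                   lambda_sem=10.0):
    """Sketch of a feature-constrained generator objective (assumed form).

    `feature_extractor` stands in for the MFLM; the abstract does not give
    the actual loss terms or weights, so all of this is illustrative.
    """
    # Adversarial term (LSGAN-style): the generator tries to make the
    # discriminator score the translated pseudo-optical image as real.
    pred_fake = discriminator(fake_optical)
    adv = F.mse_loss(pred_fake, torch.ones_like(pred_fake))

    # Semantic constraint: deep features of the pseudo-optical image should
    # match those of the reference optical image, so structural details
    # survive the modality transfer.
    with torch.no_grad():
        target_feats = feature_extractor(optical)
    sem = F.l1_loss(feature_extractor(fake_optical), target_feats)

    return adv + lambda_sem * sem
```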
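Likewise, CA-HAM is only summarized in the abstract. The sketch below shows one common way to fuse multi-level features with channel attention: upsample the pyramid levels to a shared resolution, concatenate them, and reweight channels with a squeeze-and-excitation-style gate before projecting to a descriptor map. The class name `ChannelAttentionFusion` and the channel sizes in the example are assumptions for illustration, not the published design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttentionFusion(nn.Module):
    """Illustrative channel attention-based hierarchical fusion block."""

    def __init__(self, in_channels, out_channels, reduction=8):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                            # squeeze: global average pool
            nn.Conv2d(in_channels, in_channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels // reduction, in_channels, 1),
            nn.Sigmoid(),                                        # per-channel weights in (0, 1)
        )
        self.proj = nn.Conv2d(in_channels, out_channels, 1)

    def forward(self, features):
        # `features` is a list of feature maps ordered shallow to deep.
        size = features[0].shape[-2:]
        up = [F.interpolate(f, size=size, mode="bilinear", align_corners=False)
              for f in features]
        x = torch.cat(up, dim=1)      # hierarchical aggregation across levels
        x = x * self.attn(x)          # channel attention reweighting
        return self.proj(x)           # fused map for detection/description

# Example: fuse three pyramid levels (64+128+256 = 448 input channels).
feats = [torch.randn(1, c, s, s) for c, s in [(64, 128), (128, 64), (256, 32)]]
fused = ChannelAttentionFusion(in_channels=448, out_channels=128)(feats)
```

In a joint detection-description setting, a fused map like this is typically read two ways: local maxima of a per-pixel score serve as keypoints, and the per-pixel feature vectors serve as their descriptors; how SMAF-Net implements that head is not detailed in the abstract.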
Journal introduction:
The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.