Kelong Tu, Chao Yang, Yaxian Qing, Kunlun Qi, Nengcheng Chen, Jianya Gong
{"title":"Cloud removal with optical and SAR imagery via multimodal similarity attention","authors":"Kelong Tu , Chao Yang , Yaxian Qing , Kunlun Qi , Nengcheng Chen , Jianya Gong","doi":"10.1016/j.isprsjprs.2025.05.004","DOIUrl":null,"url":null,"abstract":"<div><div>Optical remote sensing images are crucial data sources for various applications, including agricultural monitoring, land cover classification, and urban planning. However, cloud cover often hinders their effectiveness, which poses a significant challenge to downstream tasks. To address this issue, we introduce the Similarity-based Multimodal De-Clouding Network (SMDCNet), an innovative framework that enhances the quality of optical remote sensing images by utilizing multimodal similarity attention to integrate complementary information from synthetic aperture radar (SAR) imagery. First, we introduce a similarity feature attention (SFA) module that explores the similarity between optical and SAR features, aligning these cross-domain features to guide the optical encoder’s focus on cloud-free regions for more accurate feature alignment. Building on this, we propose a differential feature extraction (DFE) module that selectively uses SAR features to compensate for cloud-covered regions in the optical images. To mitigate the blurriness in the de-clouded images, we incorporate differential characteristics injection (DCI) and multi-scale feature fusion (MSFF) modules, which collaboratively enhance the reconstruction of detailed information. 
Our experiments on the SEN12MS-CR dataset demonstrate that SMDCNet effectively restores high-quality cloud-free images, achieving a PSNR of 30.2759 dB, outperforming state-of-the-art cloud removal techniques.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"226 ","pages":"Pages 116-126"},"PeriodicalIF":10.6000,"publicationDate":"2025-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625001856","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0
Abstract
Optical remote sensing images are crucial data sources for various applications, including agricultural monitoring, land cover classification, and urban planning. However, cloud cover often hinders their effectiveness, which poses a significant challenge to downstream tasks. To address this issue, we introduce the Similarity-based Multimodal De-Clouding Network (SMDCNet), an innovative framework that enhances the quality of optical remote sensing images by utilizing multimodal similarity attention to integrate complementary information from synthetic aperture radar (SAR) imagery. First, we introduce a similarity feature attention (SFA) module that explores the similarity between optical and SAR features, aligning these cross-domain features to guide the optical encoder’s focus on cloud-free regions for more accurate feature alignment. Building on this, we propose a differential feature extraction (DFE) module that selectively uses SAR features to compensate for cloud-covered regions in the optical images. To mitigate the blurriness in the de-clouded images, we incorporate differential characteristics injection (DCI) and multi-scale feature fusion (MSFF) modules, which collaboratively enhance the reconstruction of detailed information. Our experiments on the SEN12MS-CR dataset demonstrate that SMDCNet effectively restores high-quality cloud-free images, achieving a PSNR of 30.2759 dB, outperforming state-of-the-art cloud removal techniques.
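The paper itself does not include code, but the core idea behind the similarity feature attention and differential feature extraction modules can be illustrated in a simplified form: compare optical and SAR features per pixel, and inject SAR information most strongly where the two modalities disagree (e.g., under cloud). The NumPy sketch below is a hypothetical toy illustration under that assumption, not the authors' actual SFA/DFE implementation; the function name, gating rule, and feature shapes are all invented for demonstration.

```python
import numpy as np

def similarity_gated_fusion(opt_feat, sar_feat, eps=1e-8):
    """Toy sketch (not the paper's method): fuse SAR features into
    optical features, weighted by per-pixel channel-wise cosine
    similarity between the two modalities.

    opt_feat, sar_feat: feature maps of shape (C, H, W).
    Returns a fused feature map of the same shape.
    """
    # Per-pixel cosine similarity across the channel axis, shape (H, W)
    num = (opt_feat * sar_feat).sum(axis=0)
    den = np.linalg.norm(opt_feat, axis=0) * np.linalg.norm(sar_feat, axis=0) + eps
    sim = num / den                      # values in [-1, 1]

    # Low similarity (modalities disagree, e.g. cloud-covered optical
    # pixels) -> strong SAR injection; high similarity -> little change.
    gate = (1.0 - sim) / 2.0             # values in [0, 1]
    return opt_feat + gate[None, :, :] * sar_feat

# Toy example: 4-channel, 8x8 feature maps
rng = np.random.default_rng(0)
opt = rng.standard_normal((4, 8, 8))
fused = similarity_gated_fusion(opt, opt)  # identical inputs -> gate ~ 0
print(np.allclose(fused, opt, atol=1e-6))  # True: nothing to inject
```

When the two feature maps coincide, the cosine similarity is 1 everywhere, the gate closes, and the optical features pass through unchanged; the real SFA/DFE modules learn this alignment and compensation end-to-end rather than using a fixed rule.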
Journal Introduction:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in photogrammetry, remote sensing, spatial information systems, computer vision, and related fields, facilitating the communication and dissemination of advances in these disciplines while also serving as a comprehensive reference source and archive.
P&RS publishes high-quality, peer-reviewed research papers that are preferably original and previously unpublished. Papers may cover scientific/research, technological-development, or application/practical aspects. The journal also welcomes papers based on presentations at ISPRS meetings, provided they constitute significant contributions to the aforementioned fields.
In particular, P&RS encourages submissions that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, address topics that have received limited attention in P&RS or related journals, or explore new scientific or professional directions. Theoretical papers should preferably include practical applications, while papers focusing on systems and applications should include a theoretical background.