Uncertainty-Aware Dynamic Fusion Network with Criss-Cross Attention for multimodal remote sensing land cover classification
Hui Wang, Youxiang Huang, Hao Huang, Yu Wang, Jun Li, Guan Gui
Information Fusion, Volume 123, Article 103249 (published 2 May 2025)
DOI: 10.1016/j.inffus.2025.103249
Citations: 0
Abstract
Multimodal Remote Sensing Land Cover Classification (LCC) is a promising technology that integrates multi-source remote sensing data to improve classification accuracy and robustness. However, existing multimodal fusion methods are primarily static, failing to account for the variability of information across modalities and samples. In this paper, we propose an Uncertainty-aware Dynamic Fusion Network (UDFNet) for multimodal land cover classification to address this issue. UDFNet consists of three modules: a feature extraction and propagation module, a global feature association module based on Criss-Cross Attention (CCA), and a dynamic fusion module that assigns a weight to each modality based on its uncertainty, measured using energy scores. Extensive experiments conducted on the Berlin, Augsburg, MUUFL, and Trento public datasets show that our proposed method achieves superior performance compared to state-of-the-art approaches. Specifically, UDFNet achieved Overall Accuracy (OA) of 86.38%, 94.56%, 95.08%, and 99.76% on the Berlin, Augsburg, MUUFL, and Trento datasets, respectively, surpassing existing methods by up to 5% in OA. These results highlight the critical role of each module in enhancing classification accuracy and robustness.
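The dynamic fusion idea described above, weighting each modality by an uncertainty derived from an energy score, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the energy definition E(x) = -T·logsumexp(logits/T), the softmax-over-negative-energy weighting, and all function and variable names are assumptions made purely for illustration.

```python
import torch
import torch.nn.functional as F

def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Energy score per sample: E(x) = -T * logsumexp(logits / T).
    # Lower energy is commonly read as higher model confidence.
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

def dynamic_fusion(features, logits):
    # features[i], logits[i]: outputs of the i-th modality branch for one batch.
    # Turn each modality's energy into a per-sample weight (lower energy -> larger weight),
    # then take the weighted sum of the modality features.
    energies = torch.stack([energy_score(l) for l in logits], dim=0)    # shape (M, B)
    weights = F.softmax(-energies, dim=0)                               # shape (M, B), sums to 1 over modalities
    return sum(w.unsqueeze(-1) * f for w, f in zip(weights, features))  # shape (B, D)

# Toy usage with two hypothetical modality branches (e.g., hyperspectral and SAR/LiDAR features).
B, D, C = 4, 64, 10                             # batch size, feature dim, number of classes
feats = [torch.randn(B, D), torch.randn(B, D)]
outs = [torch.randn(B, C), torch.randn(B, C)]
print(dynamic_fusion(feats, outs).shape)        # torch.Size([4, 64])
```

Because the weights are computed per sample, a modality that is uninformative for one pixel (high energy) can still dominate the fusion for another, which is the behaviour the abstract contrasts with static fusion.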
About the journal
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.