Dynamic fusion of medium-resolution optical and SAR imagery for methane source infrastructure classification

Yanglangxing He, Xueliang Zhang, Pengfeng Xiao, Zhenshi Li, Dilxat Muhtar, Feng Gu, Binxiao Liu, Pengming Feng

International Journal of Applied Earth Observation and Geoinformation, Volume 144, Article 104876
DOI: 10.1016/j.jag.2025.104876
Published: 2025-09-28 | Journal Article | JCR: Q1 (Remote Sensing) | Impact Factor: 8.6
URL: https://www.sciencedirect.com/science/article/pii/S1569843225005230
Citations: 0
Abstract
Accurate classification of methane source infrastructure across sectors is critical for building comprehensive emission inventories and tracing emission sources. Existing approaches predominantly rely on high-resolution remote sensing imagery to capture discriminative features, but their scalability is limited by high costs and restricted availability. In contrast, medium-resolution imagery offers a scalable alternative with richer spectral signatures, although its lower spatial resolution hampers precise characterization and facility differentiation. To address this issue, we propose a multimodal fusion method based on Sentinel-2 and Sentinel-1 data, with the aim of exploiting the complementary characteristics of optical, infrared, and SAR imagery to improve classification accuracy. We present a dynamic multimodal fusion network (DMFNet), which incorporates a gating module and multimodal attention fusion modules (MAFM) to adaptively address sample variability and multimodal heterogeneity. Additionally, DMFNet enables tracking and interpreting the fusion process by analyzing data-driven weights, providing deep insights into modality combinations and fusion strategies for specific facility types. Experiments on the METER-ML dataset demonstrate that the proposed model achieves a precision of 0.740 and a recall of 0.757, outperforming existing single-modal and static fusion methods. Transferability experiments further confirm the practical applicability of the proposed method and its complementarity with existing open-source data in improving methane emission inventories.
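The core idea of dynamic fusion described above (a gating module producing per-sample, data-driven weights that blend modality features) can be illustrated with a minimal sketch. This is not the authors' DMFNet implementation; the function names, dimensions, and the simple linear gate are illustrative assumptions standing in for the paper's gating and attention fusion modules.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_fusion(optical_feat, sar_feat, w_gate, b_gate):
    """Per-sample dynamic fusion: a gating layer scores each modality
    from the concatenated features, then softmax weights blend them.

    optical_feat, sar_feat: (N, D) feature arrays for the two modalities.
    w_gate: (2D, 2) gate weights; b_gate: (2,) gate bias.
    Returns the fused (N, D) features and the (N, 2) modality weights,
    which can be inspected to interpret the fusion per sample.
    """
    concat = np.concatenate([optical_feat, sar_feat], axis=1)  # (N, 2D)
    scores = concat @ w_gate + b_gate                          # (N, 2)
    weights = softmax(scores, axis=1)                          # sum to 1 per sample
    fused = weights[:, :1] * optical_feat + weights[:, 1:2] * sar_feat
    return fused, weights

# Toy example: 4 samples, 8-dim features per modality.
rng = np.random.default_rng(0)
opt = rng.normal(size=(4, 8))
sar = rng.normal(size=(4, 8))
w = rng.normal(size=(16, 2)) * 0.1
b = np.zeros(2)
fused, weights = gated_fusion(opt, sar, w, b)
print(fused.shape)  # (4, 8); weights.sum(axis=1) is 1 for every sample
```

Because the weights are computed from the input itself, samples where SAR backscatter is more informative (e.g., metallic tanks) can receive a higher SAR weight than samples better characterized optically, which is the interpretability property the abstract attributes to the data-driven weights.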
Journal Introduction:
The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.