Heterogeneous Mixture of Experts for Remote Sensing Image Super-Resolution

Authors: Bowen Chen; Keyan Chen; Mohan Yang; Zhengxia Zou; Zhenwei Shi
Journal: IEEE Geoscience and Remote Sensing Letters, vol. 22, pp. 1-5
DOI: 10.1109/LGRS.2025.3557928
Published: 2025-04-04
URL: https://ieeexplore.ieee.org/document/10949132/
Citations: 0
Abstract
Remote sensing image super-resolution (SR) aims to reconstruct high-resolution (HR) remote sensing images from low-resolution (LR) inputs, thereby addressing limitations imposed by sensors and imaging conditions. However, the inherent characteristics of remote sensing images, including diverse ground object types and complex details, pose significant challenges to achieving high-quality reconstruction. Existing methods typically use a uniform structure to process various types of ground objects without distinction, making it difficult to adapt to the complex characteristics of remote sensing images. To address this issue, we introduce a mixture-of-experts (MoE) model and design a set of heterogeneous experts. These experts are organized into multiple expert groups, where experts within each group are homogeneous while being heterogeneous across groups. This design ensures that specialized activation parameters can be used to handle the diverse and intricate details of ground objects effectively. To better accommodate the heterogeneous experts, we propose a multilevel feature aggregation (MFA) strategy to guide the routing process. In addition, we develop a dual-routing mechanism to adaptively select the optimal expert for each pixel. Experiments conducted on the UCMerced and AID datasets demonstrate that our proposed method achieves superior SR reconstruction accuracy compared with state-of-the-art methods. The code will be available at https://github.com/Mr-Bamboo/MFG-HMoE
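The abstract describes grouping experts so that members of a group share one architecture while groups differ from each other, with a router selecting an expert per pixel. The paper's actual MFA strategy and dual-routing mechanism are not specified here, so the following is only a minimal NumPy sketch of the general idea: two hypothetical expert groups (lightweight linear experts vs. wider MLP experts) and simple top-1 gating per pixel. All names, shapes, and the gating scheme are illustrative assumptions, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(0)

C = 8        # channels per pixel (illustrative)
H = W = 4    # spatial size (illustrative)

def make_linear_expert(c):
    # Group A: lightweight single-matrix experts.
    Wm = rng.standard_normal((c, c)) * 0.1
    return lambda x: x @ Wm

def make_mlp_expert(c, hidden=16):
    # Group B: wider two-layer MLP experts (heterogeneous vs. group A).
    W1 = rng.standard_normal((c, hidden)) * 0.1
    W2 = rng.standard_normal((hidden, c)) * 0.1
    return lambda x: np.maximum(x @ W1, 0) @ W2

# Homogeneous within a group, heterogeneous across groups.
experts = [make_linear_expert(C) for _ in range(2)] + \
          [make_mlp_expert(C) for _ in range(2)]

def route_per_pixel(features, gate_W):
    """Assign each pixel to one expert via top-1 argmax gating."""
    flat = features.reshape(-1, C)      # (H*W, C): one row per pixel
    logits = flat @ gate_W              # (H*W, n_experts) routing scores
    choice = logits.argmax(axis=1)      # chosen expert index per pixel
    out = np.empty_like(flat)
    for e, fn in enumerate(experts):
        mask = choice == e
        if mask.any():                  # run each expert only on its pixels
            out[mask] = fn(flat[mask])
    return out.reshape(H, W, C), choice

x = rng.standard_normal((H, W, C))
gate_W = rng.standard_normal((C, len(experts)))
y, choice = route_per_pixel(x, gate_W)
```

Because only the selected expert's parameters are applied at each pixel, activation cost stays roughly constant while different object types can be handled by structurally different experts.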