MGFNet: An MLP-dominated gated fusion network for semantic segmentation of high-resolution multi-modal remote sensing images

Impact Factor: 7.6 · JCR Q1 (Remote Sensing)
Kan Wei, JinKun Dai, Danfeng Hong, Yuanxin Ye
DOI: 10.1016/j.jag.2024.104241
Journal: International journal of applied earth observation and geoinformation : ITC journal, Volume 135, Article 104241
Published: 2024-11-15
Citations: 0

Abstract

The heterogeneity and complexity of multimodal data in high-resolution remote sensing images significantly challenge existing cross-modal networks in fusing the complementary information of high-resolution optical and synthetic aperture radar (SAR) images for precise semantic segmentation. To address this issue, this paper proposes a multi-layer perceptron (MLP) dominated gated fusion network (MGFNet). MGFNet consists of three modules: a multi-path feature extraction network, an MLP-gate fusion (MGF) module, and a decoder. First, MGFNet extracts features from the high-resolution optical and SAR images independently while preserving spatial information. Then, the MGF module combines the multi-modal features through channel attention and gated fusion stages, using an MLP as a gate to exploit complementary information and filter out redundant data. Additionally, we introduce a novel high-resolution multimodal remote sensing dataset, YESeg-OPT-SAR, with a spatial resolution of 0.5 m. To evaluate MGFNet, we compare it with several state-of-the-art (SOTA) models on the YESeg-OPT-SAR and Pohang datasets, both of which are high-resolution multi-modal datasets. The experimental results demonstrate that MGFNet outperforms the other models on the evaluation metrics, indicating its effectiveness in multi-modal feature fusion for segmentation. The source code and data are available at https://github.com/yeyuanxin110/YESeg-OPT-SAR.
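The abstract describes the MGF module only at a high level: channel statistics from both modalities feed an MLP whose output gates the contribution of each branch. The paper's exact design is not reproduced here; the following NumPy sketch is one plausible reading of that description, where the global average pooling, the tiny two-layer MLP (`w1`, `w2`), and the convex per-channel mixing are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_gate_fuse(f_opt, f_sar, w1, w2):
    """Illustrative gated fusion of optical and SAR feature maps.

    f_opt, f_sar: (C, H, W) feature maps from the two branches.
    w1, w2: weights of a small two-layer MLP acting on channel statistics.
    """
    # Channel attention stage (assumed): global average pooling yields
    # per-channel statistics of the concatenated modalities.
    stats = np.concatenate([f_opt.mean(axis=(1, 2)),
                            f_sar.mean(axis=(1, 2))])  # shape (2C,)

    # MLP as gate: linear -> ReLU -> linear -> sigmoid produces a
    # per-channel gate g in (0, 1).
    hidden = np.maximum(0.0, w1 @ stats)           # shape (C_hidden,)
    g = sigmoid(w2 @ hidden)[:, None, None]        # (C, 1, 1), broadcasts over H, W

    # Gated fusion stage: per channel, select between the two modalities.
    return g * f_opt + (1.0 - g) * f_sar

rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
f_opt = rng.standard_normal((C, H, W))
f_sar = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((16, 2 * C))
w2 = rng.standard_normal((C, 16))
fused = mlp_gate_fuse(f_opt, f_sar, w1, w2)
assert fused.shape == (C, H, W)
```

Because the gate lies in (0, 1), each fused channel is a convex combination of the optical and SAR responses, which is one simple way a gate can "exploit complementary information and filter out redundant data" in the abstract's sense.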
Source journal: International journal of applied earth observation and geoinformation : ITC journal
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Review time: 77 days
Journal description: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns such as biodiversity, land degradation, and hazards, the journal explores both conceptual and data-driven approaches. It covers geoinformation themes such as data capture, databasing, visualization, interpretation, data quality, and spatial uncertainty.