Building Extraction From High-Resolution Multispectral and SAR Images Using a Boundary-Link Multimodal Fusion Network

IF 5.3 · CAS Tier 2 (Earth Science) · JCR Q1, Engineering, Electrical & Electronic
Zhe Zhao;Boya Zhao;Yuanfeng Wu;Zutian He;Lianru Gao
{"title":"Building Extraction From High-Resolution Multispectral and SAR Images Using a Boundary-Link Multimodal Fusion Network","authors":"Zhe Zhao;Boya Zhao;Yuanfeng Wu;Zutian He;Lianru Gao","doi":"10.1109/JSTARS.2025.3525709","DOIUrl":null,"url":null,"abstract":"Automatically extracting buildings with high precision from remote sensing images is crucial for various applications. Due to their distinct imaging modalities and complementary characteristics, optical and synthetic aperture radar (SAR) images serve as primary data sources for this task. We propose a novel boundary-link multimodal fusion network for joint semantic segmentation to leverage the information in these images. An initial building extraction result is obtained from the multimodal fusion network, followed by refinement using building boundaries. The model achieves high-precision building delineation by leveraging building boundary and semantic information from optical and SAR images. It distinguishes buildings from the background in complex environments, such as dense urban areas or regions with mixed vegetation, particularly when small buildings lack distinct texture or color features. We conducted experiments using the MSAW dataset (RGB-NIR and SAR data) and DFC track2 datasets (RGB and SAR data). The results indicate that our model significantly enhances extraction accuracy and improves building boundary delineation. The intersection over union metric is 2.5% to 3.5% higher than that of other multimodal joint segmentation methods.","PeriodicalId":13116,"journal":{"name":"IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing","volume":"18 ","pages":"3864-3878"},"PeriodicalIF":5.3000,"publicationDate":"2025-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10824925","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10824925/","RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Automatically extracting buildings with high precision from remote sensing images is crucial for various applications. Due to their distinct imaging modalities and complementary characteristics, optical and synthetic aperture radar (SAR) images serve as primary data sources for this task. We propose a novel boundary-link multimodal fusion network for joint semantic segmentation to leverage the information in these images. An initial building extraction result is obtained from the multimodal fusion network, followed by refinement using building boundaries. The model achieves high-precision building delineation by leveraging building boundary and semantic information from optical and SAR images. It distinguishes buildings from the background in complex environments, such as dense urban areas or regions with mixed vegetation, particularly when small buildings lack distinct texture or color features. We conducted experiments using the MSAW dataset (RGB-NIR and SAR data) and DFC track2 datasets (RGB and SAR data). The results indicate that our model significantly enhances extraction accuracy and improves building boundary delineation. The intersection over union metric is 2.5% to 3.5% higher than that of other multimodal joint segmentation methods.
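The abstract describes the method only at a high level: modality-specific encoders for the optical and SAR inputs, feature fusion to produce an initial building mask, and refinement of that mask using predicted building boundaries, evaluated with intersection over union (IoU). The sketch below is a minimal, hypothetical PyTorch illustration of that general scheme, not the paper's actual architecture. The module names (`BoundaryLinkFusionSketch`, `conv_block`), channel sizes, the concat-then-conv fusion, and the multiplicative boundary-gating rule are all assumptions made for illustration; the IoU helper mirrors the metric quoted above.

```python
# Hypothetical sketch of a dual-branch optical/SAR fusion segmenter with a
# boundary head, loosely following the scheme described in the abstract.
# All module names, channel sizes, and the refinement rule are assumptions;
# the paper's actual boundary-link architecture is not specified here.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with BatchNorm and ReLU (a common encoder block)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class BoundaryLinkFusionSketch(nn.Module):
    """Illustrative two-branch network: optical (e.g., RGB-NIR) and SAR encoders,
    feature-level fusion, a semantic head, and a boundary head whose output
    gates (refines) the initial building mask."""

    def __init__(self, opt_channels=4, sar_channels=1, base=32):
        super().__init__()
        self.opt_enc = conv_block(opt_channels, base)
        self.sar_enc = conv_block(sar_channels, base)
        self.fuse = conv_block(2 * base, base)                   # simple concat + conv fusion
        self.seg_head = nn.Conv2d(base, 1, kernel_size=1)        # building vs. background
        self.boundary_head = nn.Conv2d(base, 1, kernel_size=1)   # building-boundary map

    def forward(self, optical, sar):
        fused = self.fuse(torch.cat([self.opt_enc(optical), self.sar_enc(sar)], dim=1))
        seg = torch.sigmoid(self.seg_head(fused))                # initial extraction result
        boundary = torch.sigmoid(self.boundary_head(fused))      # boundary probability
        # Assumed refinement: emphasize mask responses near predicted boundaries.
        refined = (seg * (1.0 + boundary)).clamp(0, 1)
        return refined, boundary


def iou(pred, target, threshold=0.5, eps=1e-6):
    """Intersection over union for binary masks, the metric quoted in the abstract."""
    p = (pred > threshold).float()
    t = (target > threshold).float()
    inter = (p * t).sum()
    union = p.sum() + t.sum() - inter
    return (inter + eps) / (union + eps)


if __name__ == "__main__":
    model = BoundaryLinkFusionSketch()
    optical = torch.randn(1, 4, 128, 128)   # RGB-NIR tile (MSAW-style input)
    sar = torch.randn(1, 1, 128, 128)       # single-band SAR tile
    mask, boundary = model(optical, sar)
    print(mask.shape, boundary.shape, iou(mask, torch.zeros_like(mask)).item())
```

The concat-then-conv fusion and the multiplicative boundary gate are placeholders for the paper's boundary-link fusion and refinement modules, whose exact design is not given in the abstract.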
Source journal metrics:
CiteScore: 9.30
Self-citation rate: 10.90%
Articles published per year: 563
Review time: 4.7 months
Journal description: The IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing addresses the growing field of applications in Earth observations and remote sensing, and also provides a venue for the rapidly expanding special issues sponsored by the IEEE Geoscience and Remote Sensing Society. The journal draws upon the experience of the highly successful IEEE Transactions on Geoscience and Remote Sensing and provides a complementary medium for the wide range of topics in applied Earth observations. The "Applications" area encompasses the societal benefit areas of the Global Earth Observation System of Systems (GEOSS) program. Through deliberations over two years, ministers from 50 countries agreed to identify nine areas where Earth observation could positively impact the quality of life and health of their respective countries. Some of these are areas not traditionally addressed in the IEEE context, including biodiversity, health, and climate. Yet it is the skill sets of IEEE members, in areas such as observations, communications, computers, signal processing, standards, and ocean engineering, that form the technical underpinnings of GEOSS. Thus, the journal attracts a broad range of interests that serves present members in new ways and expands IEEE visibility into new areas.