A structure-prior guided adaptive context selection network for remote sensing semantic segmentation

IF 0.7 · CAS Region 4 (Engineering & Technology) · JCR Q4 · Engineering, Electrical & Electronic
Shengjun Xu, Rui Shen, Erhu Liu, Zongfang Ma, Miao Du, Jun Liu, Bohan Zhan
Journal: Electronics Letters, vol. 61, no. 1
DOI: 10.1049/ell2.70161
Published: 2025-02-08 (Journal Article)
Full text: https://onlinelibrary.wiley.com/doi/10.1049/ell2.70161
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70161
Citations: 0

Abstract

In remote sensing image segmentation, recognizing buildings is challenging when the visual evidence from pixels is weak or when buildings are small, spatially structured objects. To address this issue, a structure-prior guided adaptive context selection network (SGACS-Net) is proposed for remote sensing semantic segmentation. The core idea is to use structure-prior knowledge to dynamically capture prior contextual information and higher-order object structural features, thereby improving the accuracy of remote sensing building segmentation. First, an adaptive context selection module is designed. By dynamically adjusting the spatial receptive field, this module effectively models global long-range contextual dependencies. It captures the varying contextual information of buildings at different scales, thereby enhancing the network's ability to extract building feature representations. Second, a structure-prior guided variable loss function is proposed. It utilizes the structural features of building points, lines, and surfaces to identify key regions. By leveraging high-level structure-prior knowledge, it enhances the network's ability to express structural features. Experimental results on two datasets show that the proposed SGACS-Net outperforms other typical and state-of-the-art methods in terms of remote sensing semantic segmentation performance.
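The abstract does not give the internals of the adaptive context selection module, so the following is only a minimal NumPy sketch of the general idea it describes: context is pooled at several spatial scales and a softmax gate blends them, approximating a dynamically adjusted receptive field. The function names, the fixed gate logits, and the choice of average pooling are all illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def avg_pool(feat, k):
    """Average-pool a (H, W) feature map with a k x k window (stride 1, edge padding)."""
    if k == 1:
        return feat.copy()
    H, W = feat.shape
    pad = k // 2
    padded = np.pad(feat, pad, mode="edge")
    out = np.empty_like(feat)
    for i in range(H):
        for j in range(W):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def adaptive_context_selection(feat, scales=(1, 3, 7), gate_logits=None):
    """Softmax-gated blend of context pooled at several window sizes.

    A gate over scales (here a fixed logit vector for simplicity; in a
    network it would be predicted from the features) decides how much
    each spatial scale contributes to the output context.
    """
    contexts = np.stack([avg_pool(feat, k) for k in scales])   # (S, H, W)
    if gate_logits is None:
        gate_logits = np.zeros(len(scales))
    w = np.exp(gate_logits - np.max(gate_logits))
    w = w / w.sum()                                            # softmax over scales
    return np.tensordot(w, contexts, axes=1)                   # (H, W)
```

With a single scale of 1 the module reduces to the identity, and larger windows in `scales` trade spatial detail for longer-range context; a learned gate would shift this trade-off per input.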


Source journal: Electronics Letters (Engineering & Technology — Engineering: Electrical & Electronic)
CiteScore: 2.70
Self-citation rate: 0.00%
Articles published: 268
Review time: 3.6 months
Journal description: Electronics Letters is an internationally renowned peer-reviewed rapid-communication journal that publishes short original research papers every two weeks. Its broad and interdisciplinary scope covers the latest developments in all electronic engineering related fields including communication, biomedical, optical and device technologies. Electronics Letters also provides further insight into some of the latest developments through special features and interviews.

Scope: As a journal at the forefront of its field, Electronics Letters publishes papers covering all themes of electronic and electrical engineering. The major themes of the journal are:
Antennas and Propagation
Biomedical and Bioinspired Technologies, Signal Processing and Applications
Control Engineering
Electromagnetism: Theory, Materials and Devices
Electronic Circuits and Systems
Image, Video and Vision Processing and Applications
Information, Computing and Communications
Instrumentation and Measurement
Microwave Technology
Optical Communications
Photonics and Opto-Electronics
Power Electronics, Energy and Sustainability
Radar, Sonar and Navigation
Semiconductor Technology
Signal Processing
MIMO