Spectral mamba enhanced neighborhood attention network for aerial hyperspectral image classification

IF 2.8 · CAS Tier 3 (Earth Science) · JCR Q2 (Astronomy & Astrophysics)
Chuanzhi Wang, Mingyun Lv, Jun Huang, Yongmei Wu, Ruiru Qin
{"title":"Spectral mamba enhanced neighborhood attention network for aerial hyperspectral image classification","authors":"Chuanzhi Wang,&nbsp;Mingyun Lv,&nbsp;Jun Huang,&nbsp;Yongmei Wu,&nbsp;Ruiru Qin","doi":"10.1016/j.asr.2025.04.081","DOIUrl":null,"url":null,"abstract":"<div><div>Hyperspectral images (HSIs) are celebrated for their rich spectral information, making them highly effective for precise land cover classification. Deep neural networks, such as vision transformers (ViTs) and state space models (Mamba), have made significant advancements in hyperspectral image classification (HSIC). However, ViTs are often limited by their quadratic computational complexity and a predominant focus on global information, which can hinder their ability to extract crucial local features essential for HSIC. While Mamba-based architectures offer linear computational complexity and impressive performance, they are constrained by their limited understanding of the spatial and spectral information in HSIs. To address these limitations, we propose a novel spectral Mamba-enhanced neighborhood attention (SMENA) hybrid network, designed to effectively leverage the strengths of various architectures. This network integrates a local spatial feature extraction (LSFE) module with a spectral Mamba (SpeM) specifically for HSIC. The bidirectional scanning mechanism in SpeM enhances its ability to capture discriminative spectral features, while the LSFE, composed of convolutional and neighborhood attention modules, hierarchically captures detailed local spatial features. Extensive experiments on four widely used public datasets demonstrate that our model achieves superior classification performance compared to other eight benchmark methods.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"76 2","pages":"Pages 633-649"},"PeriodicalIF":2.8000,"publicationDate":"2025-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117725004521","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Hyperspectral images (HSIs) are celebrated for their rich spectral information, making them highly effective for precise land cover classification. Deep neural networks, such as vision transformers (ViTs) and state space models (Mamba), have made significant advancements in hyperspectral image classification (HSIC). However, ViTs are often limited by their quadratic computational complexity and a predominant focus on global information, which can hinder their ability to extract crucial local features essential for HSIC. While Mamba-based architectures offer linear computational complexity and impressive performance, they are constrained by their limited understanding of the spatial and spectral information in HSIs. To address these limitations, we propose a novel spectral Mamba-enhanced neighborhood attention (SMENA) hybrid network, designed to effectively leverage the strengths of various architectures. This network integrates a local spatial feature extraction (LSFE) module with a spectral Mamba (SpeM) specifically for HSIC. The bidirectional scanning mechanism in SpeM enhances its ability to capture discriminative spectral features, while the LSFE, composed of convolutional and neighborhood attention modules, hierarchically captures detailed local spatial features. Extensive experiments on four widely used public datasets demonstrate that our model achieves superior classification performance compared with eight other benchmark methods.
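To make the two-branch design described in the abstract concrete, the following is a minimal PyTorch sketch of an SMENA-style hybrid: a spatial branch (LSFE) built from a convolution plus a toy neighborhood attention, and a spectral branch (SpeM) that scans the band dimension bidirectionally. All layer sizes, the unfold-based neighborhood attention, and the bidirectional GRU used as a stand-in for the Mamba state-space scan are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only; the actual SMENA architecture may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighborhoodAttention2D(nn.Module):
    """Toy local attention: each pixel attends to its k x k neighborhood."""
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.k = kernel_size
        self.q = nn.Conv2d(dim, dim, 1)
        self.kv = nn.Conv2d(dim, 2 * dim, 1)
        self.scale = dim ** -0.5

    def forward(self, x):                                   # x: (B, C, H, W)
        B, C, H, W = x.shape
        q = self.q(x).view(B, C, 1, H * W)
        k, v = self.kv(x).chunk(2, dim=1)
        pad = self.k // 2
        # unfold gathers the k*k neighborhood of every pixel
        k_un = F.unfold(k, self.k, padding=pad).view(B, C, self.k * self.k, H * W)
        v_un = F.unfold(v, self.k, padding=pad).view(B, C, self.k * self.k, H * W)
        attn = ((q * k_un).sum(1, keepdim=True) * self.scale).softmax(dim=2)
        return (attn * v_un).sum(2).view(B, C, H, W)


class LSFE(nn.Module):
    """Local spatial feature extraction: convolution followed by neighborhood attention."""
    def __init__(self, in_ch, dim):
        super().__init__()
        self.conv = nn.Sequential(nn.Conv2d(in_ch, dim, 3, padding=1),
                                  nn.BatchNorm2d(dim), nn.GELU())
        self.na = NeighborhoodAttention2D(dim)

    def forward(self, x):
        x = self.conv(x)
        return x + self.na(x)                               # residual local attention


class SpeM(nn.Module):
    """Spectral branch with a bidirectional scan over the bands.
    A bidirectional GRU is a placeholder for the Mamba SSM scan."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.GRU(1, dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, spec):                                # spec: (B, bands)
        h, _ = self.rnn(spec.unsqueeze(-1))                 # forward + backward band scan
        return self.proj(h.mean(dim=1))                     # pooled spectral embedding


class SMENASketch(nn.Module):
    def __init__(self, bands, dim=64, num_classes=16):
        super().__init__()
        self.spatial = LSFE(bands, dim)
        self.spectral = SpeM(dim)
        self.head = nn.Linear(2 * dim, num_classes)

    def forward(self, patch):                               # patch: (B, bands, H, W)
        spa = self.spatial(patch).mean(dim=(2, 3))          # pooled spatial feature
        c = patch.shape[-1] // 2
        spe = self.spectral(patch[..., c, c])               # centre-pixel spectrum
        return self.head(torch.cat([spa, spe], dim=1))


if __name__ == "__main__":
    x = torch.randn(2, 103, 9, 9)                           # e.g. 103 bands, 9x9 patches
    print(SMENASketch(bands=103)(x).shape)                  # torch.Size([2, 16])
```

In this sketch the spatial and spectral embeddings are simply concatenated before the classifier; the paper's actual fusion strategy and the true Mamba scan are not reproduced here.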
Source Journal

Advances in Space Research (Geoscience & Astronomy; multidisciplinary Earth science)
CiteScore: 5.20
Self-citation rate: 11.50%
Articles per year: ~800
Average review time: 5.8 months
Journal description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Manuscripts related to life sciences in space are no longer accepted for submission to Advances in Space Research; such manuscripts should now be submitted to the COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.