CASSNet: Cross-Attention Enhanced Spectral–Spatial Interaction Network for Hyperspectral Image Super-Resolution

Impact factor: 4.7 · CAS Region 2 (Earth Science) · JCR Q1 (Engineering, Electrical & Electronic)
Zhanxu Zhang;Linzi Yang;Guanglian Zhang;Jiangwei Deng;Lifeng Bian;Chen Yang
{"title":"CASSNet: Cross-Attention Enhanced Spectral–Spatial Interaction Network for Hyperspectral Image Super-Resolution","authors":"Zhanxu Zhang;Linzi Yang;Guanglian Zhang;Jiangwei Deng;Lifeng Bian;Chen Yang","doi":"10.1109/JSTARS.2025.3564379","DOIUrl":null,"url":null,"abstract":"Deep-learning-based super-resolution (SR) methods for a single hyperspectral image have made significant progress in recent years and become an important research direction in remote sensing. Existing methods perform well in extracting spatial features, but challenges remain in integrating spectral and spatial features when modeling global relationships. In order to take full advantage of the higher spectral resolution of hyperspectral images, this article proposes a novel hyperspectral image SR method (CASSNet), which integrates convolutional neural networks and cross-attention mechanisms into a unified framework. This approach achieves comprehensive integration of spectral and spatial information, with extensive exploration at both local and global levels. In the local feature extraction stage, parallel 3-D/2-D convolutions work in tandem to efficiently capture detail information from both spectral and spatial dimensions. In addition, a spectral–spatial dual-branch module employing the cross-attention mechanism is designed to capture the global dependencies within the features, where the reconstructed spectral–spatial module and the spectral–spatial interaction unit can effectively promote the interaction and complementarity of spectral–spatial features. The experiments on three publicly available datasets demonstrated that the proposed method obtained superior SR results, outperforming state-of-the-art SR algorithms.","PeriodicalId":13116,"journal":{"name":"IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing","volume":"18 ","pages":"11716-11730"},"PeriodicalIF":4.7000,"publicationDate":"2025-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10979241","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10979241/","RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
引用次数: 0

Abstract

Deep-learning-based super-resolution (SR) methods for a single hyperspectral image have made significant progress in recent years and become an important research direction in remote sensing. Existing methods perform well in extracting spatial features, but challenges remain in integrating spectral and spatial features when modeling global relationships. In order to take full advantage of the higher spectral resolution of hyperspectral images, this article proposes a novel hyperspectral image SR method (CASSNet), which integrates convolutional neural networks and cross-attention mechanisms into a unified framework. This approach achieves comprehensive integration of spectral and spatial information, with extensive exploration at both local and global levels. In the local feature extraction stage, parallel 3-D/2-D convolutions work in tandem to efficiently capture detail information from both spectral and spatial dimensions. In addition, a spectral–spatial dual-branch module employing the cross-attention mechanism is designed to capture the global dependencies within the features, where the reconstructed spectral–spatial module and the spectral–spatial interaction unit can effectively promote the interaction and complementarity of spectral–spatial features. The experiments on three publicly available datasets demonstrated that the proposed method obtained superior SR results, outperforming state-of-the-art SR algorithms.
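To make the pipeline described above more concrete, the following is a minimal PyTorch sketch, not the authors' implementation, of the two mechanisms the abstract names: parallel 3-D/2-D convolutions for local spectral–spatial feature extraction, and a dual-branch cross-attention unit in which pixel tokens (spatial branch) and channel tokens (spectral branch) query one another to capture global dependencies. All module names, channel widths, kernel sizes, and the additive fusion step are assumptions made purely for illustration.

```python
# Hedged sketch (not the authors' released code) of the two ideas named in the abstract:
# (1) parallel 3-D/2-D convolutions for local spectral-spatial feature extraction and
# (2) a spectral-spatial cross-attention unit where pixel and channel tokens exchange context.
import torch
import torch.nn as nn


class Parallel3D2DConv(nn.Module):
    """Local stage: a 3-D convolution over (band, H, W) in parallel with a 2-D convolution over (H, W)."""

    def __init__(self, bands: int, feats: int = 64):
        super().__init__()
        # 3-D branch: treat the spectral dimension as depth (input B x 1 x bands x H x W).
        self.conv3d = nn.Conv3d(1, 1, kernel_size=(7, 3, 3), padding=(3, 1, 1))
        # 2-D branch: treat the bands as channels (input B x bands x H x W).
        self.conv2d = nn.Conv2d(bands, bands, kernel_size=3, padding=1)
        # 1x1 convolution fuses the two branches into a shared feature width.
        self.fuse = nn.Conv2d(2 * bands, feats, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:        # x: B x bands x H x W
        spec = self.conv3d(x.unsqueeze(1)).squeeze(1)           # spectral-aware local detail
        spat = self.conv2d(x)                                   # spatial local detail
        return self.fuse(torch.cat([spec, spat], dim=1))        # B x feats x H x W


class SpectralSpatialCrossAttention(nn.Module):
    """Global stage: pixel tokens (spatial branch) and channel tokens (spectral branch)
    attend to each other through two cross-attention blocks."""

    def __init__(self, feats: int = 64, tokens_hw: int = 16 * 16, heads: int = 4):
        super().__init__()
        # Project each channel's flattened spatial map (length H*W) to the shared width `feats`
        # so both branches use the same embedding size; assumes a fixed patch size.
        self.spec_embed = nn.Linear(tokens_hw, feats)
        self.spec_unembed = nn.Linear(feats, tokens_hw)
        self.spat_q_spec_kv = nn.MultiheadAttention(feats, heads, batch_first=True)
        self.spec_q_spat_kv = nn.MultiheadAttention(feats, heads, batch_first=True)

    def forward(self, f: torch.Tensor) -> torch.Tensor:         # f: B x feats x H x W
        b, c, h, w = f.shape
        spat = f.flatten(2).transpose(1, 2)                      # B x (H*W) x feats: one token per pixel
        spec = self.spec_embed(f.flatten(2))                     # B x feats x feats: one token per channel
        spat_out, _ = self.spat_q_spec_kv(spat, spec, spec)      # spatial queries attend to spectral keys/values
        spec_out, _ = self.spec_q_spat_kv(spec, spat, spat)      # spectral queries attend to spatial keys/values
        spat_map = (spat + spat_out).transpose(1, 2).reshape(b, c, h, w)
        spec_map = self.spec_unembed(spec + spec_out).reshape(b, c, h, w)
        return spat_map + spec_map                               # simple additive fusion (an assumption)


if __name__ == "__main__":
    x = torch.randn(1, 31, 16, 16)                               # toy hyperspectral patch: 31 bands, 16x16 pixels
    features = Parallel3D2DConv(bands=31, feats=64)(x)
    out = SpectralSpatialCrossAttention(feats=64, tokens_hw=16 * 16)(features)
    print(out.shape)                                             # torch.Size([1, 64, 16, 16])
```

The paper describes the global stage as a reconstructed spectral–spatial module plus a spectral–spatial interaction unit; the sketch collapses these into a single pair of cross-attention blocks only to keep the example short.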
Source journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
CiteScore: 9.30
Self-citation rate: 10.90%
Articles published: 563
Average review time: 4.7 months
Journal description: The IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing addresses the growing field of applications in Earth observations and remote sensing, and also provides a venue for the rapidly expanding special issues sponsored by the IEEE Geoscience and Remote Sensing Society. The journal draws upon the experience of the highly successful IEEE Transactions on Geoscience and Remote Sensing and provides a complementary medium for the wide range of topics in applied Earth observations. The "Applications" areas encompass the societal benefit areas of the Global Earth Observation System of Systems (GEOSS) program. Through deliberations over two years, ministers from 50 countries agreed to identify nine areas where Earth observation could positively impact the quality of life and health of their respective countries. Some of these are areas not traditionally addressed in the IEEE context, including biodiversity, health, and climate. Yet it is the skill sets of IEEE members, in areas such as observations, communications, computers, signal processing, standards, and ocean engineering, that form the technical underpinnings of GEOSS. Thus, the journal attracts a broad range of interests that serves present members in new ways and expands IEEE visibility into new areas.