An efficient hyperspectral image classification method using retentive network

Impact Factor 2.8 · CAS Tier 3 (Earth Science) · JCR Q2 (Astronomy & Astrophysics)
Rajat Kumar Arya, Subhojit Paul, Rajeev Srivastava
{"title":"An efficient hyperspectral image classification method using retentive network","authors":"Rajat Kumar Arya,&nbsp;Subhojit Paul,&nbsp;Rajeev Srivastava","doi":"10.1016/j.asr.2024.10.001","DOIUrl":null,"url":null,"abstract":"<div><div>In recent computer vision tasks, the vision transformer (ViT) has demonstrated competitive ability. However, ViT still has problems: the computational complexity of the self-attention layer leads to expensive and slow interference, and processing all tokens for high-resolution images may slow down due to the layer’s quadratic complexity. Recently, a retentive network with excellent performance, training parallelism, and an inexpensive inference cost was proposed. For hyperspectral image (HSI) classification, this paper proposes<!--> <!-->a retention-based network model called the HSI retentive network (HSIRN). The proposed model allows memory usage independent of the token’s sequence, facilitating the efficient processing of high-resolution images with low inference and computational costs. Although the retention encoder can extract global data, it pays limited attention to local data. A powerful tool for extracting local information is a convolutional neural network (CNN). The proposed HSIRN model uses a specific CNN-based block to extract local spectral-spatial information. To maintain degradation between successive vertical and horizontal positions with the depth dimension of the HSI, we propose a three-dimensional retention mechanism for the three-dimensional HSI dataset in the retention encoder. By efficiently using both local and global spectral-spatial information, the proposed method offers a potent tool for HSI classification. We evaluated the classification performance of the proposed HSIRN approach on four datasets through comprehensive examinations, and the results demonstrated its superiority over state-of-the-art methods. At <span><span>https://github.com/RajatArya22/HSIRN</span><svg><path></path></svg></span>, the source code will be available to the public.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"75 2","pages":"Pages 1701-1718"},"PeriodicalIF":2.8000,"publicationDate":"2025-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117724010081","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

In recent computer vision tasks, the vision transformer (ViT) has demonstrated competitive ability. However, ViT still has problems: the computational complexity of the self-attention layer makes inference expensive and slow, and processing all tokens of a high-resolution image is slow because of the layer's quadratic complexity. Recently, a retentive network with excellent performance, training parallelism, and a low inference cost was proposed. For hyperspectral image (HSI) classification, this paper proposes a retention-based network model called the HSI retentive network (HSIRN). The proposed model keeps memory usage independent of the token sequence length, enabling efficient processing of high-resolution images with low inference and computational costs. Although the retention encoder can extract global information, it pays limited attention to local information. A convolutional neural network (CNN) is a powerful tool for extracting local information, so the proposed HSIRN model uses a dedicated CNN-based block to extract local spectral-spatial features. To preserve the decay relationship between successive vertical and horizontal positions along the depth dimension of the HSI, we propose a three-dimensional retention mechanism for the three-dimensional HSI data in the retention encoder. By efficiently using both local and global spectral-spatial information, the proposed method offers a potent tool for HSI classification. We evaluated the classification performance of the proposed HSIRN approach on four datasets through comprehensive experiments, and the results demonstrated its superiority over state-of-the-art methods. The source code will be made publicly available at https://github.com/RajatArya22/HSIRN.
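To make the memory claim concrete, the sketch below shows the recurrent form of a generic retention update: a constant-size state with exponential decay, as used in retentive networks in general. This is a minimal illustrative sketch, not the authors' HSIRN implementation; the function name, decay value, and tensor shapes are assumptions for illustration only.

```python
# Illustrative sketch of a retention-style recurrent update (assumption:
# this mirrors the generic retentive-network recurrence, not HSIRN itself).
import torch

def recurrent_retention(q, k, v, decay=0.97):
    """Process tokens one at a time with an exponentially decaying state.

    q, k, v: (seq_len, d) projections of the input tokens.
    The running state has a fixed size, so memory does not grow with the
    sequence length -- the property the abstract highlights for HSIRN.
    """
    seq_len, d = q.shape
    state = torch.zeros(d, d)              # fixed-size memory, independent of seq_len
    outputs = []
    for t in range(seq_len):
        # decay the old state, then accumulate the current key-value outer product
        state = decay * state + torch.outer(k[t], v[t])
        outputs.append(q[t] @ state)       # read out with the current query
    return torch.stack(outputs)

# Toy usage with random projections (shapes are arbitrary for the example)
q, k, v = (torch.randn(16, 8) for _ in range(3))
out = recurrent_retention(q, k, v)
print(out.shape)  # torch.Size([16, 8])
```

In this recurrent form each step costs O(d^2) regardless of sequence length, in contrast to self-attention, which attends over all previous tokens and scales quadratically with the number of tokens.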
Source journal
Advances in Space Research
CiteScore: 5.20
Self-citation rate: 11.50%
Annual publications: 800
Review time: 5.8 months
Journal description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research, including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Please note that manuscripts related to life sciences in space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.