{"title":"UNMamba: Cascaded Spatial–Spectral Mamba for Blind Hyperspectral Unmixing","authors":"Dong Chen;Junping Zhang;Jiaxin Li","doi":"10.1109/LGRS.2025.3545505","DOIUrl":null,"url":null,"abstract":"Blind hyperspectral unmixing (HU) has advanced significantly with the emergence of deep learning-based methods. However, the localized operations of convolutional neural networks (CNNs) and the high computational demands of Transformers present challenges for blind HU. This necessitates the development of image-level unmixing methods capable of capturing long-range spatial-spectral dependencies with low computational demands. This letter proposes a cascaded spatial-spectral Mamba model, termed UNMamba, which leverages the strengths of Mamba to efficiently model long-range spatial-spectral dependencies with linear computational complexity, achieving superior image-level unmixing performance with small parameters and operations. Specifically, UNMamba first captures long-range spatial dependencies, followed by the extraction of global spectral features, forming long-range spatial-spectral dependencies, which are subsequently mapped into abundance maps. Then, the input image is reconstructed using the linear mixing model (LMM), incorporating weighted averages of multiple trainable random sequences and an endmember loss to learn endmembers. UNMamba is the first unmixing approach that introduces the state-space models (SSMs). Extensive experimental results demonstrate that, without relying on any endmember initialization techniques [such as vertex component analysis (VCA)], the proposed UNMamba achieves significantly high unmixing accuracy, outperforming state-of-the-art methods. Codes are available at <uri>https://github.com/Preston-Dong/UNMamba</uri>.","PeriodicalId":91017,"journal":{"name":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","volume":"22 ","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2025-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10902420/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Blind hyperspectral unmixing (HU) has advanced significantly with the emergence of deep learning-based methods. However, the localized operations of convolutional neural networks (CNNs) and the high computational demands of Transformers present challenges for blind HU. This necessitates the development of image-level unmixing methods capable of capturing long-range spatial–spectral dependencies at low computational cost. This letter proposes a cascaded spatial–spectral Mamba model, termed UNMamba, which leverages the strengths of Mamba to efficiently model long-range spatial–spectral dependencies with linear computational complexity, achieving superior image-level unmixing performance with a small number of parameters and operations. Specifically, UNMamba first captures long-range spatial dependencies and then extracts global spectral features, forming long-range spatial–spectral dependencies that are subsequently mapped into abundance maps. The input image is then reconstructed using the linear mixing model (LMM), in which endmembers are learned as weighted averages of multiple trainable random sequences under an endmember loss. UNMamba is the first unmixing approach to introduce state-space models (SSMs). Extensive experimental results demonstrate that, without relying on any endmember initialization techniques [such as vertex component analysis (VCA)], the proposed UNMamba achieves significantly higher unmixing accuracy than state-of-the-art methods. Code is available at https://github.com/Preston-Dong/UNMamba.
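To make the LMM reconstruction step concrete, the sketch below illustrates the general idea in PyTorch: endmembers are formed as a weighted average of trainable random sequences, and the image is reconstructed as abundances times endmembers. This is a minimal sketch, not the authors' implementation; the class name, shapes, the softmax weighting of the sequences, and the clamp for non-negativity are all assumptions. The actual UNMamba code, including the cascaded spatial–spectral Mamba encoder that produces the abundances, is in the linked repository.

```python
# Hedged sketch of an LMM-style unmixing decoder (not the authors' code).
# Endmembers are a learned weighted average of trainable random sequences;
# the image is reconstructed via the linear mixing model X_hat = A @ E.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LMMDecoderSketch(nn.Module):
    def __init__(self, num_endmembers: int, num_bands: int, num_sequences: int = 8):
        super().__init__()
        # Multiple trainable random sequences; their weighted average
        # yields the endmember matrix E of shape (P, B).
        self.sequences = nn.Parameter(
            torch.rand(num_sequences, num_endmembers, num_bands)
        )
        self.mix_logits = nn.Parameter(torch.zeros(num_sequences))

    def endmembers(self) -> torch.Tensor:
        # Softmax weights make the average a convex combination
        # (an assumption, not necessarily the paper's weighting scheme).
        w = F.softmax(self.mix_logits, dim=0)             # (S,)
        E = torch.einsum("s,spb->pb", w, self.sequences)  # (P, B)
        return E.clamp(min=0.0)  # keep endmember spectra non-negative

    def forward(self, abundances: torch.Tensor) -> torch.Tensor:
        # abundances: (N, P), assumed non-negative and sum-to-one
        # Linear mixing model: each pixel is a mixture of endmembers.
        return abundances @ self.endmembers()             # (N, B)


if __name__ == "__main__":
    # In UNMamba the abundances come from the cascaded spatial-spectral
    # Mamba encoder; here we fake them with a softmax over random logits.
    N, P, B = 1024, 4, 198  # pixels, endmembers, spectral bands (illustrative)
    decoder = LMMDecoderSketch(num_endmembers=P, num_bands=B)
    A = F.softmax(torch.randn(N, P), dim=-1)  # enforces ANC and ASC
    X_hat = decoder(A)
    print(X_hat.shape)  # torch.Size([1024, 198])
```

In training, such a decoder would be optimized jointly with the encoder under a reconstruction loss plus the endmember loss mentioned above; the softmax on the abundances enforces the non-negativity and sum-to-one constraints of the LMM.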