{"title":"小镜头高光谱图像分类知识精馏中粒度不匹配的克服","authors":"Hao Wu;Zhaohui Xue;Shaoguang Zhou;Hongjun Su","doi":"10.1109/TGRS.2025.3530614","DOIUrl":null,"url":null,"abstract":"Hyperspectral image classification (HSIC) often struggles due to the scarcity of labeled samples. Knowledge distillation (KD), including self-distillation (SD) where a model learns from its own predictions, has emerged as a promising solution. However, existing distillation methods in HSIC face a “granularity mismatch” problem as they rely on coarse, patch-level data for fine-grained, pixel-level classification, which introduces label noise and causes misclassification. To overcome this issue, we propose central spectral self-distillation (CSSD), a framework that isolates pure spectral information at the patch center and leverages it for SD. CSSD consists of three main components. First, the backbone network separates spectral and spatial feature processing to extract pure central spectral features. Second, a spectral refiner module enhances these spectral features before integrating spatial context. Finally, an SD loss aligns the final predictions with the central spectral guidance, ensuring granularity matching at the pixel level. The experimental results on five hyperspectral datasets demonstrate the effectiveness of CSSD under few-shot conditions. The source code will be available online at <uri>https://github.com/ZhaohuiXue/CSSD</uri>.","PeriodicalId":13213,"journal":{"name":"IEEE Transactions on Geoscience and Remote Sensing","volume":"63 ","pages":"1-17"},"PeriodicalIF":8.6000,"publicationDate":"2025-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Overcoming Granularity Mismatch in Knowledge Distillation for Few-Shot Hyperspectral Image Classification\",\"authors\":\"Hao Wu;Zhaohui Xue;Shaoguang Zhou;Hongjun Su\",\"doi\":\"10.1109/TGRS.2025.3530614\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hyperspectral image classification (HSIC) often struggles due to the scarcity of labeled samples. Knowledge distillation (KD), including self-distillation (SD) where a model learns from its own predictions, has emerged as a promising solution. However, existing distillation methods in HSIC face a “granularity mismatch” problem as they rely on coarse, patch-level data for fine-grained, pixel-level classification, which introduces label noise and causes misclassification. To overcome this issue, we propose central spectral self-distillation (CSSD), a framework that isolates pure spectral information at the patch center and leverages it for SD. CSSD consists of three main components. First, the backbone network separates spectral and spatial feature processing to extract pure central spectral features. Second, a spectral refiner module enhances these spectral features before integrating spatial context. Finally, an SD loss aligns the final predictions with the central spectral guidance, ensuring granularity matching at the pixel level. The experimental results on five hyperspectral datasets demonstrate the effectiveness of CSSD under few-shot conditions. 
The source code will be available online at <uri>https://github.com/ZhaohuiXue/CSSD</uri>.\",\"PeriodicalId\":13213,\"journal\":{\"name\":\"IEEE Transactions on Geoscience and Remote Sensing\",\"volume\":\"63 \",\"pages\":\"1-17\"},\"PeriodicalIF\":8.6000,\"publicationDate\":\"2025-01-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Geoscience and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10843769/\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Geoscience and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10843769/","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Overcoming Granularity Mismatch in Knowledge Distillation for Few-Shot Hyperspectral Image Classification
Hyperspectral image classification (HSIC) often struggles due to the scarcity of labeled samples. Knowledge distillation (KD), including self-distillation (SD) where a model learns from its own predictions, has emerged as a promising solution. However, existing distillation methods in HSIC face a “granularity mismatch” problem as they rely on coarse, patch-level data for fine-grained, pixel-level classification, which introduces label noise and causes misclassification. To overcome this issue, we propose central spectral self-distillation (CSSD), a framework that isolates pure spectral information at the patch center and leverages it for SD. CSSD consists of three main components. First, the backbone network separates spectral and spatial feature processing to extract pure central spectral features. Second, a spectral refiner module enhances these spectral features before integrating spatial context. Finally, an SD loss aligns the final predictions with the central spectral guidance, ensuring granularity matching at the pixel level. The experimental results on five hyperspectral datasets demonstrate the effectiveness of CSSD under few-shot conditions. The source code will be available online at https://github.com/ZhaohuiXue/CSSD.
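The abstract only sketches the three components at a high level. Below is a minimal, hypothetical PyTorch sketch of how a central-spectral self-distillation objective of this kind could be wired up: a spectral branch on the patch-center pixel produces pixel-level guidance, a spatial branch processes the full patch, and a KL term aligns the fused prediction with the central spectral one. All names (CSSDSketch, spectral_refiner, cssd_loss), layer choices, dimensions, and the detached-teacher formulation are illustrative assumptions, not the authors' implementation; their released code is at the GitHub link above.

```python
# Hypothetical sketch of a central spectral self-distillation setup.
# Module names, layer sizes, and the loss weighting are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CSSDSketch(nn.Module):
    """Toy model: a spectral branch on the patch-center pixel guides the
    spectral-spatial prediction via a self-distillation (KL) term."""

    def __init__(self, num_bands: int, num_classes: int, hidden: int = 64):
        super().__init__()
        # Spectral branch: operates only on the center pixel's spectrum.
        self.spectral_encoder = nn.Sequential(
            nn.Linear(num_bands, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Stand-in for the "spectral refiner": refines central spectral features.
        self.spectral_refiner = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        # Spatial branch: 2-D convolutions over the whole patch.
        self.spatial_encoder = nn.Sequential(
            nn.Conv2d(num_bands, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.spectral_head = nn.Linear(hidden, num_classes)   # pixel-level guidance head
        self.fused_head = nn.Linear(hidden * 2, num_classes)  # final spectral-spatial head

    def forward(self, patch: torch.Tensor):
        # patch: (B, num_bands, H, W); the labeled pixel sits at the patch center.
        b, c, h, w = patch.shape
        center = patch[:, :, h // 2, w // 2]              # (B, num_bands)
        spec = self.spectral_refiner(self.spectral_encoder(center))
        spat = self.spatial_encoder(patch).flatten(1)     # (B, hidden)
        spec_logits = self.spectral_head(spec)            # central spectral prediction
        fused_logits = self.fused_head(torch.cat([spec, spat], dim=1))
        return fused_logits, spec_logits


def cssd_loss(fused_logits, spec_logits, labels, alpha=0.5, temperature=2.0):
    # Supervised loss on the fused prediction plus a self-distillation term that
    # aligns it with the (detached) central spectral prediction.
    ce = F.cross_entropy(fused_logits, labels)
    kd = F.kl_div(
        F.log_softmax(fused_logits / temperature, dim=1),
        F.softmax(spec_logits.detach() / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return ce + alpha * kd


if __name__ == "__main__":
    model = CSSDSketch(num_bands=103, num_classes=9)      # e.g., 103 bands, 9 classes
    x = torch.randn(4, 103, 9, 9)                         # four 9x9 patches
    y = torch.randint(0, 9, (4,))
    fused, spec = model(x)
    print(cssd_loss(fused, spec, y).item())
```

In this sketch the spectral head acts as the pixel-granularity guide while the fused head is trained both on the labels and on the KL term, which loosely mirrors the "granularity matching" idea the abstract describes; the real CSSD architecture and loss may differ.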
Journal Introduction:
IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.