Huaxiang Song, Junping Xie, Yingying Duan, Xinyi Xie, Yang Zhou, Wenhui Wang
Title: CMKD-Net: a cross-modal knowledge distillation method for remote sensing image classification
DOI: 10.1016/j.asr.2025.04.009
Journal: Advances in Space Research, Volume 75, Issue 12, Pages 8515-8534
Published: 2025-04-10 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0273117725003333
Citations: 0
Abstract
Cross-modal knowledge distillation (KD) offers the potential to combine the strengths of Vision Transformers (ViTs) and Convolutional Neural Networks (CNNs) in remote sensing image (RSI) classification. However, existing KD techniques in this field are frequently ineffective and time-consuming. We contend that this inefficiency stems from the data sparsity inherent in RSI samples, a challenge long overlooked in previous studies. To address this issue, we propose a novel algorithm designed to alleviate data sparsity and enhance the quality of the training data. Building upon this, we introduce CMKD-Net, a KD framework that facilitates knowledge transfer from a ViT teacher to a CNN student model. Experimental evaluations on three RSI datasets demonstrate that CMKD-Net outperforms 17 state-of-the-art models published since 2022 in both classification accuracy and model compactness. Furthermore, our method reduces training time by at least 83% compared with current KD methods, making cross-modal KD for RSI classification substantially more efficient.
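The abstract does not give CMKD-Net's actual objective, but the teacher–student transfer it builds on is the standard knowledge-distillation recipe: the student is trained against a weighted sum of the hard-label cross-entropy and a temperature-softened KL divergence toward the teacher's logits. A minimal generic sketch (function name and the `T`, `alpha` hyperparameters are illustrative assumptions, not the paper's design):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic teacher->student distillation loss (not CMKD-Net's
    exact objective): alpha-weighted soft-target KL term plus
    (1 - alpha)-weighted hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-scaled
    # distributions, rescaled by T^2 to keep gradient magnitudes
    # comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In a cross-modal setting such as the one described here, `teacher_logits` would come from the (frozen) ViT teacher and `student_logits` from the CNN student on the same RSI batch.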
Journal Introduction:
The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc.
NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR).
All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.