CMKD-Net: a cross-modal knowledge distillation method for remote sensing image classification

IF 2.8 | CAS Tier 3 (Earth Science) | JCR Q2 (Astronomy & Astrophysics)
Huaxiang Song, Junping Xie, Yingying Duan, Xinyi Xie, Yang Zhou, Wenhui Wang
{"title":"CMKD-Net: a cross-modal knowledge distillation method for remote sensing image classification","authors":"Huaxiang Song ,&nbsp;Junping Xie ,&nbsp;Yingying Duan ,&nbsp;Xinyi Xie ,&nbsp;Yang Zhou ,&nbsp;Wenhui Wang","doi":"10.1016/j.asr.2025.04.009","DOIUrl":null,"url":null,"abstract":"<div><div>Cross-modal knowledge distillation (KD) offers the potential to synergize the strengths of Vision Transformers (ViTs) and Convolutional Neural Networks (CNNs) in remote sensing image (RSI) classification. However, existing KD techniques in this field are frequently ineffective and time-consuming. We contend that this inefficiency stems from the data sparsity inherent in RSI samples—a challenge long overlooked in previous studies. To address this issue, we propose a novel algorithm designed to alleviate data sparsity and enhance the quality of the training data. Building upon this, we introduce CMKD-Net, a KD framework that facilitates knowledge transfer from a ViT teacher to a CNN student model. Experimental evaluations on three RSI datasets demonstrate that CMKD-Net outperforms 17 state-of-the-art models published since 2022 on classification accuracy and model compactness. Furthermore, our method cuts down training time by at least 83% compared to current KD methods, making cross-modal KD for RSI classification much more effective.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"75 12","pages":"Pages 8515-8534"},"PeriodicalIF":2.8000,"publicationDate":"2025-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117725003333","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Cross-modal knowledge distillation (KD) offers the potential to synergize the strengths of Vision Transformers (ViTs) and Convolutional Neural Networks (CNNs) in remote sensing image (RSI) classification. However, existing KD techniques in this field are frequently ineffective and time-consuming. We contend that this inefficiency stems from the data sparsity inherent in RSI samples, a challenge long overlooked in previous studies. To address this issue, we propose a novel algorithm designed to alleviate data sparsity and enhance the quality of the training data. Building upon this, we introduce CMKD-Net, a KD framework that facilitates knowledge transfer from a ViT teacher to a CNN student model. Experimental evaluations on three RSI datasets demonstrate that CMKD-Net outperforms 17 state-of-the-art models published since 2022 in both classification accuracy and model compactness. Furthermore, our method reduces training time by at least 83% compared to current KD methods, making cross-modal KD substantially more practical for RSI classification.
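The paper itself provides no code, but the mechanism it builds on, distilling knowledge from a ViT teacher into a CNN student, can be sketched. The PyTorch fragment below is a minimal illustration of generic logit distillation (Hinton-style KD), not CMKD-Net's actual loss or architecture; the function name kd_loss, the temperature T, the weight alpha, the label-space size, and the Linear placeholder models are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic logit distillation: weighted sum of hard-label cross-entropy
    and KL divergence between temperature-softened class distributions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),   # student log-probs at temperature T
        F.softmax(teacher_logits / T, dim=1),       # teacher soft targets at temperature T
        reduction="batchmean",
    ) * (T * T)  # T*T rescaling keeps soft-target gradients comparable across temperatures
    return alpha * ce + (1.0 - alpha) * kd

# Toy stand-ins; in practice the teacher would be a pretrained ViT and the
# student a compact CNN, both mapping images to class logits.
num_classes = 45  # illustrative label-space size
teacher = torch.nn.Linear(512, num_classes)
student = torch.nn.Linear(512, num_classes)

x = torch.randn(8, 512)                       # batch of placeholder features
labels = torch.randint(0, num_classes, (8,))  # ground-truth class indices

teacher.eval()
with torch.no_grad():                 # teacher is frozen: it supplies soft targets only
    t_logits = teacher(x)
s_logits = student(x)

loss = kd_loss(s_logits, t_logits, labels)
loss.backward()                       # gradients flow into the student only
```

Keeping the teacher under torch.no_grad() is the standard design choice here: only the compact student is updated, which is what makes the distilled model cheaper to deploy than the ViT teacher.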
Source Journal
Advances in Space Research (Geosciences and Astronomy; Comprehensive Earth Science)
CiteScore: 5.20
Self-citation rate: 11.50%
Articles per year: 800
Review time: 5.8 months
Journal description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.