Toward Effective Knowledge Distillation for Fine-Grained Object Recognition in Remote Sensing

Impact Factor: 4.4
Yangte Gao; Chenwei Deng; Liang Chen
{"title":"面向遥感细粒度目标识别的有效知识精馏","authors":"Yangte Gao;Chenwei Deng;Liang Chen","doi":"10.1109/LGRS.2025.3591045","DOIUrl":null,"url":null,"abstract":"With advancements in on-board computing devices deployed on remote sensing platforms, the demand for efficiently processing remote sensing imagery has become increasingly prominent. Knowledge distillation, as an effective lightweight method, has been introduced into this domain. Intuitively, distillation from a larger teacher model is expected to yield better performance. However, in our investigation of fine-grained object recognition in remote sensing imagery, we observed a counter-intuitive phenomenon: as the size of the teacher model increases, the performance of the student model initially improves but then degrades. This capacity gap issue hinders effective utilization of stronger teacher models. To address this issue, we propose a novel distillation framework named BL-KD. It integrates two tailored components: the class-level learnable orthogonal projection (CLOP) module and the object rebalance (ORB) module, which are jointly optimized to mitigate the negative impact of the capacity gap while effectively adapting to the unique distributional patterns and challenges inherent in remote sensing imagery. Experiments conducted on multiple fine-grained object recognition tasks in remote sensing demonstrate that our method consistently improves student performance, particularly in scenarios involving large teacher–student gaps, and outperforms several widely used distillation baselines.","PeriodicalId":91017,"journal":{"name":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","volume":"22 ","pages":"1-5"},"PeriodicalIF":4.4000,"publicationDate":"2025-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Toward Effective Knowledge Distillation for Fine-Grained Object Recognition in Remote Sensing\",\"authors\":\"Yangte Gao;Chenwei Deng;Liang Chen\",\"doi\":\"10.1109/LGRS.2025.3591045\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With advancements in on-board computing devices deployed on remote sensing platforms, the demand for efficiently processing remote sensing imagery has become increasingly prominent. Knowledge distillation, as an effective lightweight method, has been introduced into this domain. Intuitively, distillation from a larger teacher model is expected to yield better performance. However, in our investigation of fine-grained object recognition in remote sensing imagery, we observed a counter-intuitive phenomenon: as the size of the teacher model increases, the performance of the student model initially improves but then degrades. This capacity gap issue hinders effective utilization of stronger teacher models. To address this issue, we propose a novel distillation framework named BL-KD. It integrates two tailored components: the class-level learnable orthogonal projection (CLOP) module and the object rebalance (ORB) module, which are jointly optimized to mitigate the negative impact of the capacity gap while effectively adapting to the unique distributional patterns and challenges inherent in remote sensing imagery. 
Experiments conducted on multiple fine-grained object recognition tasks in remote sensing demonstrate that our method consistently improves student performance, particularly in scenarios involving large teacher–student gaps, and outperforms several widely used distillation baselines.\",\"PeriodicalId\":91017,\"journal\":{\"name\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"volume\":\"22 \",\"pages\":\"1-5\"},\"PeriodicalIF\":4.4000,\"publicationDate\":\"2025-07-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11086626/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11086626/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

With advancements in on-board computing devices deployed on remote sensing platforms, the demand for efficiently processing remote sensing imagery has become increasingly prominent. Knowledge distillation, as an effective lightweight method, has been introduced into this domain. Intuitively, distillation from a larger teacher model is expected to yield better performance. However, in our investigation of fine-grained object recognition in remote sensing imagery, we observed a counter-intuitive phenomenon: as the size of the teacher model increases, the performance of the student model initially improves but then degrades. This capacity gap issue hinders effective utilization of stronger teacher models. To address this issue, we propose a novel distillation framework named BL-KD. It integrates two tailored components: the class-level learnable orthogonal projection (CLOP) module and the object rebalance (ORB) module, which are jointly optimized to mitigate the negative impact of the capacity gap while effectively adapting to the unique distributional patterns and challenges inherent in remote sensing imagery. Experiments conducted on multiple fine-grained object recognition tasks in remote sensing demonstrate that our method consistently improves student performance, particularly in scenarios involving large teacher–student gaps, and outperforms several widely used distillation baselines.
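The abstract does not specify BL-KD's training objective, so as background only, the following is a minimal sketch of standard logit-based knowledge distillation (Hinton et al., 2015), the setting in which the capacity-gap phenomenon described above arises. The function name kd_loss and the hyperparameters T (temperature) and alpha (loss weight) are illustrative assumptions, not details from the paper.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard hard-label cross-entropy on the student's own predictions.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

In this framing, a larger teacher simply supplies teacher_logits; the counter-intuitive result reported above is that beyond a certain teacher size, a student trained with such a loss degrades rather than improves, which is the gap that BL-KD's CLOP and ORB modules are designed to close.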