Domain-Invariant Progressive Knowledge Distillation for UAV-Based Object Detection

Liang Yao, Fan Liu, Chuanyi Zhang, Zhiquan Ou, Ting Wu
IEEE Geoscience and Remote Sensing Letters, vol. 22, pp. 1-5
DOI: 10.1109/LGRS.2024.3492187
Published: 2024-11-06 (Journal Article)
https://ieeexplore.ieee.org/document/10745610/

Abstract

Knowledge distillation (KD) is an effective method for compressing models in object detection tasks. Due to limited onboard computational capability, unmanned aerial vehicle-based object detection (UAV-OD) widely adopts the KD technique to obtain lightweight detectors. Existing methods often overlook the significant differences in feature space caused by the large gap in scale between the teacher and student models. This limitation hampers the efficiency of knowledge transfer during the distillation process. Furthermore, the complex backgrounds in aerial images make it challenging for the student model to efficiently learn the object features. In this letter, we propose a novel KD framework for UAV-OD. Specifically, a progressive distillation approach is designed to alleviate the feature gap between teacher and student models. Then, a new feature alignment method is provided to extract object-related features, enhancing the student model's knowledge reception efficiency. Finally, extensive experiments are conducted to validate the effectiveness of our proposed approach. The results demonstrate that our proposed method achieves state-of-the-art performance on two datasets.
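The letter does not include code, but the feature-alignment idea sketched in the abstract — suppressing cluttered backgrounds so the student focuses on object-related features — can be illustrated as a masked feature-level distillation loss. The following NumPy sketch is purely illustrative; the function name, shapes, and the source of the mask are assumptions, not the authors' implementation:

```python
import numpy as np

def masked_feature_distill_loss(f_t, f_s, obj_mask, eps=1e-8):
    """MSE between teacher and student feature maps, weighted by an
    object mask so background regions contribute nothing.

    f_t, f_s:  (C, H, W) feature maps, assumed already projected to a
               common channel dimension (e.g., by a 1x1 conv).
    obj_mask:  (H, W) weights in [0, 1] marking object-related regions.
    """
    sq_err = (f_t - f_s) ** 2                 # per-element error
    weighted = sq_err * obj_mask[None, :, :]  # zero out background error
    return weighted.sum() / (f_t.shape[0] * obj_mask.sum() + eps)

# Toy check: a student close to the teacher yields a small loss.
rng = np.random.default_rng(0)
f_teacher = rng.standard_normal((4, 8, 8))
f_student = f_teacher + 0.05 * rng.standard_normal((4, 8, 8))
mask = np.zeros((8, 8))
mask[2:6, 2:6] = 1.0                          # hypothetical object region
loss = masked_feature_distill_loss(f_teacher, f_student, mask)
```

In practice such a mask could be derived from ground-truth boxes or a learned attention map, and the progressive part of the framework would apply a loss of this kind between successively smaller models rather than directly from the full teacher to the final student; both choices here are illustrative assumptions.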