Domain-Invariant Progressive Knowledge Distillation for UAV-Based Object Detection
Liang Yao; Fan Liu; Chuanyi Zhang; Zhiquan Ou; Ting Wu
IEEE Geoscience and Remote Sensing Letters, vol. 22, pp. 1-5
Published: 2024-11-06 · DOI: 10.1109/LGRS.2024.3492187
https://ieeexplore.ieee.org/document/10745610/
Citations: 0
Abstract
Knowledge distillation (KD) is an effective method for compressing models in object detection tasks. Because of limited onboard computational capability, unmanned aerial vehicle-based object detection (UAV-OD) systems widely adopt KD to obtain lightweight detectors. Existing methods often overlook the significant differences in feature space caused by the large gap in scale between the teacher and student models, which hampers the efficiency of knowledge transfer during distillation. Furthermore, the complex backgrounds in aerial images make it challenging for the student model to efficiently learn object features. In this letter, we propose a novel KD framework for UAV-OD. Specifically, a progressive distillation approach is designed to alleviate the feature gap between the teacher and student models. Then, a new feature alignment method is proposed to extract object-related features, enhancing the student model's knowledge reception efficiency. Finally, extensive experiments validate the effectiveness of the proposed approach: the results demonstrate that our method achieves state-of-the-art performance on two datasets.
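The abstract gives no implementation details, so the following is only a rough NumPy sketch of the two generic ideas it names: a feature-level distillation loss restricted to object-related regions, and a progressive target that narrows the teacher-student feature gap over training. The function names, the use of an intermediate "assistant" representation, and the linear blending schedule are all assumptions for illustration, not the authors' actual method.

```python
import numpy as np

def feature_distillation_loss(student_feat, teacher_feat, mask=None):
    """Mean-squared error between student and teacher feature maps.
    An optional binary mask restricts the loss to object-related
    regions (a stand-in for the letter's feature-alignment idea,
    which suppresses complex aerial backgrounds)."""
    diff = (student_feat - teacher_feat) ** 2
    if mask is not None:
        diff = diff * mask
        return diff.sum() / max(mask.sum(), 1)
    return diff.mean()

def progressive_targets(teacher_feat, assistant_feat, step, total_steps):
    """Blend an intermediate 'assistant' feature map toward the full
    teacher as training progresses, so the student never has to
    bridge the entire teacher-student feature gap at once."""
    alpha = step / total_steps  # 0 -> assistant only, 1 -> teacher only
    return (1 - alpha) * assistant_feat + alpha * teacher_feat

# Toy usage: distill toward a progressively harder target.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((8, 8))
assistant = 0.5 * teacher                     # hypothetical mid-sized model
student = np.zeros((8, 8))
mask = (np.abs(teacher) > 1.0).astype(float)  # crude "object region" proxy
target = progressive_targets(teacher, assistant, step=3, total_steps=10)
loss = feature_distillation_loss(student, target, mask)
```

The masked average (dividing by the number of foreground cells rather than the full map size) keeps the loss magnitude comparable regardless of how many pixels the objects occupy.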