High-Quality-High-Quantity Semantic Distillation for Incremental Object Detection

Mengxue Kang, Jinpeng Zhang, Xiashuang Wang, Xuhui Huang
{"title":"High-Quality-High-Quantity Semantic Distillation for Incremental Object Detection","authors":"Mengxue Kang, Jinpeng Zhang, Xiashuang Wang, Xuhui Huang","doi":"10.1145/3579654.3579682","DOIUrl":null,"url":null,"abstract":"Model is required to learn from dynamic data stream under incremental object detection task. However, traditional object detection model fails to deal with this scenario. Fine-tuning on new task suffers from a fast performance decay of early learned tasks, which is known as catastrophic forgetting. A promising way to alleviate catastrophic forgetting is knowledge distillation, which includes feature distillation and response distillation. Previous feature distillation methods have not discuss knowledge selection and knowledge transfer at the same time. In this paper, we propose high-level semantic feature distillation and task re-balancing strategy that consider both high-quality knowledge selection and high-quantity knowledge transfer simultaneously. Extensive experiments are conducted on MS COCO benchmarks. The performance of our method exceeds previous SOTA methods under all experimental scenarios. Remarkably, our method reduces the mAP gap toward full-training to 2.58, which is much better than that of the previous SOTA method with a gap of 3.30.","PeriodicalId":146783,"journal":{"name":"Proceedings of the 2022 5th International Conference on Algorithms, Computing and Artificial Intelligence","volume":"267 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 5th International Conference on Algorithms, Computing and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3579654.3579682","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In the incremental object detection task, a model is required to learn from a dynamic data stream. However, traditional object detection models fail to handle this scenario: fine-tuning on a new task suffers from rapid performance decay on earlier learned tasks, a phenomenon known as catastrophic forgetting. A promising way to alleviate catastrophic forgetting is knowledge distillation, which includes feature distillation and response distillation. Previous feature distillation methods have not addressed knowledge selection and knowledge transfer at the same time. In this paper, we propose a high-level semantic feature distillation and task re-balancing strategy that considers high-quality knowledge selection and high-quantity knowledge transfer simultaneously. Extensive experiments are conducted on MS COCO benchmarks. The performance of our method exceeds that of previous SOTA methods under all experimental scenarios. Remarkably, our method reduces the mAP gap toward full training to 2.58, which is much better than the previous SOTA method's gap of 3.30.
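For readers unfamiliar with distillation-based incremental detection, the sketch below illustrates the general idea in PyTorch: a frozen teacher trained on earlier classes supplies high-level feature maps, the student is penalized for drifting from them, and the detection losses for old and new classes are re-weighted. This is a minimal sketch under assumptions of our own; the helper names (feature_distillation_loss, total_loss) and the weights (lambda_feat, old_weight, new_weight) are hypothetical and do not reproduce the paper's exact high-level semantic distillation or task re-balancing formulation.

import torch
import torch.nn.functional as F


def feature_distillation_loss(student_feat: torch.Tensor,
                              teacher_feat: torch.Tensor) -> torch.Tensor:
    # L2 distillation between student and teacher feature maps of shape (B, C, H, W).
    # The teacher is frozen, so its features are detached from the graph.
    return F.mse_loss(student_feat, teacher_feat.detach())


def total_loss(det_loss_new: torch.Tensor,
               det_loss_old: torch.Tensor,
               student_feat: torch.Tensor,
               teacher_feat: torch.Tensor,
               lambda_feat: float = 1.0,
               old_weight: float = 1.0,
               new_weight: float = 1.0) -> torch.Tensor:
    # Re-balance the detection losses for old and new classes and add the
    # feature-distillation term; all weights here are illustrative placeholders.
    distill = feature_distillation_loss(student_feat, teacher_feat)
    return new_weight * det_loss_new + old_weight * det_loss_old + lambda_feat * distill

In practice the distillation term is what preserves performance on earlier tasks, while the old/new weights control how strongly the new task is allowed to reshape the shared features.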