Object-Oriented Relational Distillation for Object Detection

Shuyu Miao, Rui Feng
{"title":"面向对象的对象检测关系蒸馏","authors":"Shuyu Miao, Rui Feng","doi":"10.1109/ICASSP39728.2021.9413925","DOIUrl":null,"url":null,"abstract":"Object detection models have achieved increasingly better performance based on more complex architecture designs, but the heavy computation limits their further widespread application on the devices with insufficient computational power. To this end, we propose a novel Object-Oriented Relational Distillation (OORD) method that drives small detection models to have an effective performance like large detection models with constant efficiency. Here, we introduce to distill relative relation knowledge from teacher/large models to student/small models, which promotes the small models to learn better soft feature representation by the guiding of large models. OORD consists of two parts, i.e., Object Extraction (OE) and Relation Distillation (RD). OE extracts foreground features to avoid background feature interference, and RD distills the relative relations between the foreground features through graph convolution. Related experiments conducted on various kinds of detection models show the effectiveness of OORD, which improves the performance of the small model by nearly 10% without additional inference time cost.","PeriodicalId":347060,"journal":{"name":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Object-Oriented Relational Distillation for Object Detection\",\"authors\":\"Shuyu Miao, Rui Feng\",\"doi\":\"10.1109/ICASSP39728.2021.9413925\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Object detection models have achieved increasingly better performance based on more complex architecture designs, but the heavy computation limits their further widespread application on the devices with insufficient computational power. To this end, we propose a novel Object-Oriented Relational Distillation (OORD) method that drives small detection models to have an effective performance like large detection models with constant efficiency. Here, we introduce to distill relative relation knowledge from teacher/large models to student/small models, which promotes the small models to learn better soft feature representation by the guiding of large models. OORD consists of two parts, i.e., Object Extraction (OE) and Relation Distillation (RD). OE extracts foreground features to avoid background feature interference, and RD distills the relative relations between the foreground features through graph convolution. 
Related experiments conducted on various kinds of detection models show the effectiveness of OORD, which improves the performance of the small model by nearly 10% without additional inference time cost.\",\"PeriodicalId\":347060,\"journal\":{\"name\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICASSP39728.2021.9413925\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP39728.2021.9413925","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Object detection models have achieved increasingly strong performance through more complex architecture designs, but their heavy computation limits wider deployment on devices with insufficient computational power. To this end, we propose a novel Object-Oriented Relational Distillation (OORD) method that drives small detection models toward the accuracy of large detection models while keeping their efficiency unchanged. Specifically, we distill relative relational knowledge from teacher (large) models into student (small) models, which encourages the small models to learn better soft feature representations under the guidance of the large models. OORD consists of two parts: Object Extraction (OE) and Relation Distillation (RD). OE extracts foreground features to avoid interference from background features, and RD distills the relative relations among the foreground features through graph convolution. Experiments on various detection models demonstrate the effectiveness of OORD, which improves the performance of the small model by nearly 10% without additional inference-time cost.
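The abstract describes OE and RD only at a high level, so the following is a minimal, hypothetical PyTorch sketch of the two components as stated: pooling features inside ground-truth boxes for OE, and one graph-convolution step over a pairwise-similarity adjacency plus an L2 matching loss for RD. The function names, the mean-pooling choice, the softmax adjacency, and the MSE loss are all assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def extract_object_features(feature_map, boxes):
    """Object Extraction (OE): pool features inside each ground-truth box so
    that background activations do not contaminate the distillation signal.

    feature_map: (C, H, W) backbone feature map.
    boxes: (N, 4) object boxes (x1, y1, x2, y2) in feature-map coordinates.
    Returns an (N, C) matrix of per-object foreground features.
    """
    feats = []
    for x1, y1, x2, y2 in boxes.long():
        region = feature_map[:, y1:y2 + 1, x1:x2 + 1]  # crop the object region
        feats.append(region.mean(dim=(1, 2)))          # average-pool to (C,)
    return torch.stack(feats)

def relation_distillation_loss(student_feats, teacher_feats):
    """Relation Distillation (RD): embed each set of object features with one
    graph-convolution step over a pairwise-similarity adjacency matrix, then
    match the student's relational embedding to the teacher's.
    """
    def graph_embed(x):                       # x: (N, C)
        adj = F.softmax(x @ x.t(), dim=-1)    # relative relations as adjacency
        return adj @ x                        # one step of message passing
    return F.mse_loss(graph_embed(student_feats), graph_embed(teacher_feats))

# Hypothetical usage inside a training step (shapes are illustrative):
# s_obj = extract_object_features(student_map, gt_boxes)   # (N, C)
# t_obj = extract_object_features(teacher_map, gt_boxes)   # (N, C)
# loss = detection_loss + relation_distillation_loss(s_obj, t_obj)
```

In a real setup the student's features would presumably be projected to the teacher's channel dimension (e.g., by a 1x1 convolution) before the loss is computed, and the distillation term is added only during training, which is consistent with the claim that inference time is unchanged.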