DF-FSOD: A Novel Approach for Few-shot Object Detection via Distinguished Features

Anh-Khoa Nguyen Vu, Thanh-Danh Nguyen, Vinh-Tiep Nguyen, T. Ngo
DOI: 10.1109/MAPR53640.2021.9585248
Published in: 2021 International Conference on Multimedia Analysis and Pattern Recognition (MAPR), October 2021

Abstract

Few-shot object detection (FSOD) is a challenging task in which detectors are trained to recognize unseen objects from limited training data. The majority of existing methods are evaluated on benchmarks built with a fixed number of base and novel categories; specifically, the number of base classes is larger than the number of novel classes, which positively affects the performance measured on novel data. However, few works have focused on the effect of such dominant categories on the performance of FSOD models. In this paper, we investigate the performance of detectors under different ratios of base and novel categories in the novel phase. Based on our findings on the interaction between base and novel classes, we present a new approach, Distinguished Features for FSOD (DF-FSOD), which encourages the detector to learn distinguished features that better capture novel objects via base-class expansion. Our proposed method outperforms previous works by an average of 4% AP@50 on the unseen classes of PASCAL VOC when labeled data is extremely scarce.
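The benchmark setup the abstract describes — abundant data for base classes, but only K annotated instances per novel class — can be sketched as follows. This is a minimal illustration of the data split only; the function name and data layout are assumptions, not the paper's code:

```python
import random
from collections import defaultdict


def build_fewshot_split(annotations, base_classes, novel_classes, k_shot, seed=0):
    """Build a K-shot training split.

    Base classes keep all of their annotated images, while each novel
    class is capped at `k_shot` instances — the imbalance whose effect
    on FSOD performance the paper investigates.

    `annotations` is a list of (image_id, class_name) pairs.
    """
    rng = random.Random(seed)  # fixed seed: few-shot splits must be reproducible
    per_class = defaultdict(list)
    for image_id, cls in annotations:
        per_class[cls].append(image_id)

    split = {}
    for cls in base_classes:
        split[cls] = list(per_class[cls])   # base phase: use everything
    for cls in novel_classes:
        pool = list(per_class[cls])
        rng.shuffle(pool)
        split[cls] = pool[:k_shot]          # novel phase: at most K shots
    return split
```

Varying `base_classes` versus `novel_classes` in such a split is one way to reproduce the different base/novel ratios studied in the paper.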