{"title":"Few-Shot Object Detection Based on Adaptive Attention Mechanism and Large-Margin Softmax","authors":"Rong Huang, Runchao Lin, Aihua Dong, Zhijie Wang","doi":"10.1177/24723444221136626","DOIUrl":null,"url":null,"abstract":"Recently, a DCNet consisting of a dense relation distillation module and a context-aware aggregation module has achieved remarkable performance for the few-shot object detection task. In this article, we aim to improve the DCNet from the following two aspects. First, we design an adaptive attention module, which is equipped in the front of the dense relation distillation module, and can be trained together with the remainder parts of the DCNet. After training, the adaptive attention module helps to enhance foreground features and to suppress the background features. Second, we introduce a large-margin Softmax into the dense relation distillation module. The large-margin Softmax with a hyperparameter can normalize features without reducing the discriminability between different classes. We conduct extensive experiments on the PASCAL visual object classes and the Microsoft common objects in context data sets. The experimental results show that the proposed method can work under the few-shot scenario and achieves the mean average precision of 50.8% on the PASCAL visual object classes data set and 13.1% on the Microsoft common objects in context data set, which both outperform the existing baselines. Moreover, ablation studies and visualizations validate the usefulness of the adaptive attention module and the large-margin Softmax. The proposed method can be applied to recognize rare patterns in fabric images or detect clothes with new styles in natural scene images.","PeriodicalId":6955,"journal":{"name":"AATCC Journal of Research","volume":"1 1","pages":""},"PeriodicalIF":0.6000,"publicationDate":"2022-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"AATCC Journal of Research","FirstCategoryId":"88","ListUrlMain":"https://doi.org/10.1177/24723444221136626","RegionNum":4,"RegionCategory":"Engineering & Technology","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"MATERIALS SCIENCE, TEXTILES","Score":null,"Total":0}
Citations: 0
Abstract
Recently, DCNet, which consists of a dense relation distillation module and a context-aware aggregation module, has achieved remarkable performance on the few-shot object detection task. In this article, we improve DCNet in two respects. First, we design an adaptive attention module that is placed in front of the dense relation distillation module and can be trained jointly with the rest of DCNet. After training, the adaptive attention module helps to enhance foreground features and suppress background features. Second, we introduce a large-margin Softmax into the dense relation distillation module. The large-margin Softmax, controlled by a margin hyperparameter, can normalize features without reducing the discriminability between different classes. We conduct extensive experiments on the PASCAL Visual Object Classes (VOC) and Microsoft Common Objects in Context (COCO) data sets. The experimental results show that the proposed method works under the few-shot scenario, achieving a mean average precision of 50.8% on the PASCAL VOC data set and 13.1% on the Microsoft COCO data set, both of which outperform the existing baselines. Moreover, ablation studies and visualizations validate the usefulness of the adaptive attention module and the large-margin Softmax. The proposed method can be applied to recognize rare patterns in fabric images or to detect clothes with new styles in natural scene images.
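The large-margin Softmax mentioned in the abstract can be sketched concretely. The abstract does not specify which margin formulation the authors use, so the snippet below assumes an illustrative cosine-margin variant: features and class weights are L2-normalized, and the true class's cosine similarity is reduced by a margin `m` before scaling by `s`. The function name and the default values of `s` and `m` are hypothetical, not taken from the paper.

```python
import math


def large_margin_softmax(features, weights, label, s=16.0, m=0.35):
    """Illustrative large-margin Softmax over L2-normalized features.

    Penalizing the ground-truth class's cosine similarity by a margin m
    forces the network to separate classes by more than the margin,
    which is what preserves inter-class discriminability after
    feature normalization.
    """
    def l2norm(v):
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]

    x = l2norm(features)
    logits = []
    for j, w in enumerate(weights):
        cos_theta = sum(a * b for a, b in zip(x, l2norm(w)))
        if j == label:
            # apply the margin only to the ground-truth class
            cos_theta -= m
        logits.append(s * cos_theta)

    # numerically stable Softmax
    mx = max(logits)
    exps = [math.exp(z - mx) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

With `m = 0` this reduces to a plain normalized Softmax; a positive margin lowers the true class's probability during training, so minimizing the loss pushes the true-class angle to be smaller by at least the margin.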
About the journal:
AATCC Journal of Research is a textile research journal with a broad scope: from advanced materials, fibers, and textile and polymer chemistry to color science, apparel design, and sustainability.
Now indexed by the Science Citation Index Expanded (SCIE) and discoverable in the Clarivate Analytics Web of Science Core Collection. The Journal's impact factor is available in Journal Citation Reports.