{"title":"An Efficient Fine-Grained Recognition Method Enhanced by Res2Net Based on Dynamic Sparse Attention.","authors":"Qifeng Niu, Hui Wang, Feng Xu","doi":"10.3390/s25134147","DOIUrl":null,"url":null,"abstract":"<p><p>Fine-grained recognition tasks face significant challenges in differentiating subtle, class-specific details against cluttered backgrounds. This paper presents an efficient architecture built upon the Res2Net backbone, significantly enhanced by a dynamic Sparse Attention mechanism. The core approach leverages the inherent multi-scale representation power of Res2Net to capture discriminative patterns across different granularities. Crucially, the integrated Sparse Attention module operates dynamically, selectively amplifying the most informative features while attenuating irrelevant background noise and redundant details. This combined strategy substantially improves the model's ability to focus on pivotal regions critical for accurate classification. Furthermore, strategic architectural optimizations are applied throughout to minimize computational complexity, resulting in a model that demands significantly fewer parameters and exhibits faster inference times. Extensive evaluations on benchmark datasets demonstrate the effectiveness of the proposed method. It achieves a modest but consistent accuracy gain over strong baselines (approximately 2%) while simultaneously reducing model size by around 30% and inference latency by about 20%, proving highly effective for practical fine-grained recognition applications requiring both high accuracy and operational efficiency.</p>","PeriodicalId":21698,"journal":{"name":"Sensors","volume":"25 13","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12252481/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.3390/s25134147","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, ANALYTICAL","Score":null,"Total":0}
Abstract
Fine-grained recognition tasks face significant challenges in differentiating subtle, class-specific details against cluttered backgrounds. This paper presents an efficient architecture built upon the Res2Net backbone, significantly enhanced by a dynamic Sparse Attention mechanism. The core approach leverages the inherent multi-scale representation power of Res2Net to capture discriminative patterns across different granularities. Crucially, the integrated Sparse Attention module operates dynamically, selectively amplifying the most informative features while attenuating irrelevant background noise and redundant details. This combined strategy substantially improves the model's ability to focus on pivotal regions critical for accurate classification. Furthermore, strategic architectural optimizations are applied throughout to minimize computational complexity, resulting in a model that demands significantly fewer parameters and exhibits faster inference times. Extensive evaluations on benchmark datasets demonstrate the effectiveness of the proposed method. It achieves a modest but consistent accuracy gain over strong baselines (approximately 2%) while simultaneously reducing model size by around 30% and inference latency by about 20%, proving highly effective for practical fine-grained recognition applications requiring both high accuracy and operational efficiency.
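To make the two ideas in the abstract concrete, the sketch below shows a minimal, illustrative combination of a Res2Net-style multi-scale block with a dynamic, top-k "sparse attention" gate that keeps only the most informative spatial locations and suppresses the rest. This is an assumption-laden reading of the abstract, not the authors' published implementation: the module names (SparseAttention, Res2NetSparseBlock), the keep_ratio parameter, and the exact gating scheme are all hypothetical choices made for illustration.

```python
# Illustrative sketch only: a top-k ("dynamic sparse") spatial attention gate applied to
# Res2Net-style multi-scale features. Names and the gating scheme are assumptions,
# not the authors' published method.
import torch
import torch.nn as nn


class SparseAttention(nn.Module):
    """Scores each spatial location, keeps only the top-k most informative ones,
    and suppresses the rest (background / redundant detail) before re-weighting."""

    def __init__(self, channels: int, keep_ratio: float = 0.25):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)  # per-location importance score
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        scores = self.score(x).flatten(1)                    # (B, H*W)
        k = max(1, int(self.keep_ratio * scores.shape[1]))   # number of locations to keep
        topk = torch.topk(scores, k, dim=1)
        mask = torch.zeros_like(scores)
        mask.scatter_(1, topk.indices, torch.sigmoid(topk.values))  # sparse, input-dependent gate
        return x * mask.view(b, 1, h, w)                     # amplify kept locations only


class Res2NetSparseBlock(nn.Module):
    """Res2Net-style split/cascade block followed by the sparse attention gate."""

    def __init__(self, channels: int, scales: int = 4):
        super().__init__()
        assert channels % scales == 0
        width = channels // scales
        self.scales = scales
        self.convs = nn.ModuleList(
            [nn.Conv2d(width, width, 3, padding=1) for _ in range(scales - 1)]
        )
        self.attn = SparseAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        splits = torch.chunk(x, self.scales, dim=1)          # hierarchical channel split
        outs, prev = [splits[0]], None
        for i, conv in enumerate(self.convs):
            prev = conv(splits[i + 1] if prev is None else splits[i + 1] + prev)
            outs.append(prev)                                # cascade enlarges the receptive field
        return self.attn(torch.cat(outs, dim=1)) + x         # residual connection


if __name__ == "__main__":
    block = Res2NetSparseBlock(channels=64)
    y = block(torch.randn(2, 64, 56, 56))
    print(y.shape)  # torch.Size([2, 64, 56, 56])
```

The top-k gate is one plausible way to realize "selectively amplifying the most informative features while attenuating irrelevant background"; because the mask zeroes most locations, subsequent computation can in principle be restricted to the kept regions, which is consistent with the abstract's emphasis on reduced complexity and faster inference.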
Journal Introduction
Sensors (ISSN 1424-8220) provides an advanced forum for the science and technology of sensors and biosensors. It publishes reviews (including comprehensive reviews of complete sensor products), regular research papers, and short notes. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of papers, and full experimental details must be provided so that the results can be reproduced.