Person re-identification method based on fine-grained feature fusion and self-attention mechanism

Kangning Yin, Zhen Ding, Zhihua Dong, Xinhui Ji, Zhipei Wang, Dongsheng Chen, Ye Li, Guangqiang Yin, Zhiguo Wang

Computing, published 2024-03-25. DOI: 10.1007/s00607-024-01270-5
Abstract
To address the low accuracy of person re-identification (Re-ID) algorithms in complex environments, caused by occlusion, low distinctiveness of person features, and unclear detail features, we propose a Re-ID method based on fine-grained feature fusion and a self-attention mechanism. First, we design a dilated non-local module (DNLM), which combines dilated convolution with the non-local module and is embedded between layers of the backbone network, enlarging the model's receptive field and strengthening its self-attention, which improves performance under occlusion. Second, we improve the fine-grained feature fusion screening module (3FSM) based on the outlook attention module; it performs adaptive feature selection and enhances the model's ability to distinguish similar samples. Finally, drawing on the feature pyramid from object detection, we propose a multi-scale feature fusion pyramid (MFFP) for Re-ID, in which features from different levels are used for feature enhancement. Ablation studies and comprehensive experiments on multiple datasets validate the effectiveness of our proposal: the mean Average Precision (mAP) on Market1501 and DukeMTMC-reID is 92.5% and 87.7%, and Rank-1 accuracy is 95.1% and 91.1%, respectively. Compared with current mainstream Re-ID algorithms, our method achieves excellent Re-ID performance.
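The abstract does not give implementation details for the DNLM, but the general idea it describes (a dilated convolution feeding a non-local self-attention block, inserted between backbone stages) can be sketched as follows. This is a minimal PyTorch sketch under stated assumptions, not the authors' code: the class name, the dilation rate of 2, the channel reduction factor, and the embedded-Gaussian form of the non-local attention are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class DilatedNonLocalBlock(nn.Module):
    """Hypothetical sketch of a dilated non-local block: a dilated 3x3
    convolution widens the receptive field before non-local self-attention
    is computed over all spatial positions, with a residual connection."""

    def __init__(self, channels: int, dilation: int = 2, reduction: int = 2):
        super().__init__()
        inter = channels // reduction
        # Dilated 3x3 conv: same output size (padding = dilation), but a
        # larger effective receptive field than a plain 3x3 conv.
        self.dilated = nn.Conv2d(channels, channels, kernel_size=3,
                                 padding=dilation, dilation=dilation)
        # 1x1 projections for embedded-Gaussian non-local attention.
        self.theta = nn.Conv2d(channels, inter, kernel_size=1)
        self.phi = nn.Conv2d(channels, inter, kernel_size=1)
        self.g = nn.Conv2d(channels, inter, kernel_size=1)
        self.out = nn.Conv2d(inter, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        feat = self.dilated(x)
        # Flatten spatial dimensions so attention runs over H*W positions.
        q = self.theta(feat).view(b, -1, h * w).permute(0, 2, 1)  # (B, HW, C')
        k = self.phi(feat).view(b, -1, h * w)                     # (B, C', HW)
        v = self.g(feat).view(b, -1, h * w).permute(0, 2, 1)      # (B, HW, C')
        attn = torch.softmax(q @ k, dim=-1)                       # (B, HW, HW)
        y = (attn @ v).permute(0, 2, 1).view(b, -1, h, w)         # (B, C', H, W)
        # Residual connection, as in the standard non-local block.
        return x + self.out(y)

# Usage: insert between backbone stages, e.g. after a ResNet layer
# producing 256-channel feature maps.
block = DilatedNonLocalBlock(channels=256)
y = block(torch.randn(2, 256, 32, 16))  # shape preserved: (2, 256, 32, 16)
```

Because the block preserves the input shape and ends in a residual addition, it can be dropped between existing backbone layers without changing the rest of the network, which matches the abstract's description of embedding the module between layers.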
About the journal
Computing publishes original papers, short communications, and surveys on all fields of computing. Contributions should be written in English and may be theoretical or applied in nature; the essential criteria are computational relevance and a systematic foundation of results.