Unsupervised Cross-domain Person Re-Identification based on Asymmetrical Pyramid Non-local Block

Xiangyu Li, Yuhang Zheng, Shangmin Zhou
{"title":"Unsupervised Cross-domain Person Re-Identification based on Asymmetrical Pyramid Non-local Block","authors":"Xiangyu Li, Yuhang Zheng, Shangmin Zhou","doi":"10.1145/3561613.3561636","DOIUrl":null,"url":null,"abstract":"The purpose of unsupervised cross-domain (UCD) person re-identification (re-ID) is to adapt the well pre-trained model on the labeled source domain to the unlabeled target domain, which tackles a more realistic problem. However, the network in the existing model cannot fully extract the features of pedestrians, so the results after clustering are not satisfactory. To address this problem, a feature extraction network model with a self-attention mechanism is proposed in this paper in order to improve the feature expression ability. We try to design and optimize the attention mechanism-based feature extraction network and similarity loss function for unsupervised person re-ID to improve the recognition accuracy. On the basis of the baseline network (such as ResNet-50), the self-attention mechanism-asymmetrical pyramid non-local block (APNB) is added to help the network learn richer global feature representation. Besides, the similarity loss function using the Euclidean distance is designed, which shows better performance than the cosine distance. Experimental results show that the proposed method has competitive performance on two public datasets Markket-1501 and DukeMTMC-Re-ID.","PeriodicalId":348024,"journal":{"name":"Proceedings of the 5th International Conference on Control and Computer Vision","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th International Conference on Control and Computer Vision","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3561613.3561636","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Unsupervised cross-domain (UCD) person re-identification (re-ID) aims to adapt a model pre-trained on a labeled source domain to an unlabeled target domain, which addresses a more realistic setting. However, the feature extraction network in existing models cannot fully capture pedestrian features, so the clustering results are unsatisfactory. To address this problem, this paper proposes a feature extraction network with a self-attention mechanism to improve feature representation ability. We design and optimize an attention-based feature extraction network and a similarity loss function for unsupervised person re-ID to improve recognition accuracy. On top of a baseline backbone (such as ResNet-50), an asymmetrical pyramid non-local block (APNB), a self-attention mechanism, is added to help the network learn richer global feature representations. In addition, a similarity loss function based on the Euclidean distance is designed, which performs better than its cosine-distance counterpart. Experimental results show that the proposed method achieves competitive performance on two public datasets, Market-1501 and DukeMTMC-reID.
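In an asymmetric pyramid non-local block, the query keeps the full spatial resolution while the key and value maps are subsampled by spatial pyramid pooling, so attention is computed against a small set of anchor positions instead of all H×W locations. The following is a minimal PyTorch sketch of such a block; the pooling sizes, channel reduction, and residual fusion are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of an asymmetric pyramid non-local (APNB-style) block.
# Pool sizes and channel reduction below are assumed for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AsymmetricPyramidNonLocal(nn.Module):
    def __init__(self, in_channels, reduced_channels=None, pool_sizes=(1, 3, 6, 8)):
        super().__init__()
        reduced_channels = reduced_channels or in_channels // 2
        self.query = nn.Conv2d(in_channels, reduced_channels, 1)
        self.key = nn.Conv2d(in_channels, reduced_channels, 1)
        self.value = nn.Conv2d(in_channels, reduced_channels, 1)
        self.out = nn.Conv2d(reduced_channels, in_channels, 1)
        self.pool_sizes = pool_sizes

    def _pyramid_sample(self, x):
        # Pool the feature map at several scales and flatten the sampled
        # positions, so attention runs against S << H*W anchor points.
        n, c, _, _ = x.shape
        pooled = [F.adaptive_avg_pool2d(x, s).view(n, c, -1) for s in self.pool_sizes]
        return torch.cat(pooled, dim=2)                          # (N, C', S)

    def forward(self, x):
        n, _, h, w = x.shape
        q = self.query(x).view(n, -1, h * w).permute(0, 2, 1)    # (N, HW, C')
        k = self._pyramid_sample(self.key(x))                    # (N, C', S)
        v = self._pyramid_sample(self.value(x))                  # (N, C', S)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)            # (N, HW, S)
        ctx = torch.bmm(attn, v.permute(0, 2, 1))                # (N, HW, C')
        ctx = ctx.permute(0, 2, 1).view(n, -1, h, w)
        return x + self.out(ctx)                                 # residual connection
```

In a ResNet-50 backbone, a block like this would typically be inserted after one of the later stages so the self-attention aggregates global context on a moderately sized feature map.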
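The abstract states that the similarity loss is built on the Euclidean distance rather than the cosine distance, but does not give its exact form. The sketch below is one plausible contrastive-style instantiation over cluster pseudo-labels; the margin, squaring, and averaging choices are assumptions made only for illustration.

```python
# Illustrative pairwise similarity loss based on Euclidean distance.
# Same-pseudo-label pairs are pulled together; other pairs are pushed
# beyond a margin. This formulation is an assumption, not the paper's.
import torch
import torch.nn.functional as F

def euclidean_similarity_loss(features, pseudo_labels, margin=1.0):
    """features: (B, D) embeddings; pseudo_labels: (B,) cluster assignments."""
    dist = torch.cdist(features, features, p=2)                  # pairwise Euclidean distances
    same = (pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)).float()
    eye = torch.eye(len(features), device=features.device)
    pos = same - eye                                             # same cluster, excluding self
    neg = 1.0 - same                                             # different clusters
    pos_loss = (pos * dist.pow(2)).sum() / pos.sum().clamp(min=1)
    neg_loss = (neg * F.relu(margin - dist).pow(2)).sum() / neg.sum().clamp(min=1)
    return pos_loss + neg_loss
```

In the cross-domain setting, the pseudo-labels would come from clustering target-domain features extracted by the source-pretrained network, with the loss refining the model on those clusters.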