Cross-domain person re-identification via learning Heterogeneous Pseudo Labels

IF 7.5 · CAS Tier 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Zhong Zhang, Di He, Shuang Liu
{"title":"通过学习异构伪标签进行跨域人员再识别","authors":"Zhong Zhang,&nbsp;Di He,&nbsp;Shuang Liu","doi":"10.1016/j.patcog.2025.111702","DOIUrl":null,"url":null,"abstract":"<div><div>Assigning pseudo labels is vital for cross-domain person re-identification (ReID), and most existing methods only assign one kind of pseudo labels to unlabeled target domain samples, which cannot describe these unlabeled samples accurately due to large intra-class and small inter-class variances caused by diverse environmental factors, such as occlusions, illuminations, viewpoints, and poses, etc. In this paper, we propose a novel label learning method named Heterogeneous Pseudo Labels (HPL) for cross-domain person ReID, which could overcome large intra-class and small inter-class variances between pedestrian images in the target domain. For each unlabeled target domain sample, HPL simultaneously learns three different kinds of pseudo labels, i.e., fine-grained labels, coarse-grained labels, and instance labels. With the three kinds of labels, we could make full use of their own advantages to describe target domain samples from different perspectives. Meanwhile, we propose the Pseudo Labels Constraint (PLC) to improve the quality of the heterogeneous labels by using their consistency. Furthermore, in order to relieve the influence of noisy labels from the aspect of contrastive learning, we propose the Confidence Contrastive Loss (CCL) to consider the sample confidence in the learning process. Extensive experiments on four cross-domain tasks demonstrate that the proposed method achieves a new state-of-the-art performance, for example, the proposed method achieves 87.2% mAP and 95.0% Rank-1 accuracy on MSMT17<span><math><mo>→</mo></math></span>Market.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"166 ","pages":"Article 111702"},"PeriodicalIF":7.5000,"publicationDate":"2025-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Cross-domain person re-identification via learning Heterogeneous Pseudo Labels\",\"authors\":\"Zhong Zhang,&nbsp;Di He,&nbsp;Shuang Liu\",\"doi\":\"10.1016/j.patcog.2025.111702\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Assigning pseudo labels is vital for cross-domain person re-identification (ReID), and most existing methods only assign one kind of pseudo labels to unlabeled target domain samples, which cannot describe these unlabeled samples accurately due to large intra-class and small inter-class variances caused by diverse environmental factors, such as occlusions, illuminations, viewpoints, and poses, etc. In this paper, we propose a novel label learning method named Heterogeneous Pseudo Labels (HPL) for cross-domain person ReID, which could overcome large intra-class and small inter-class variances between pedestrian images in the target domain. For each unlabeled target domain sample, HPL simultaneously learns three different kinds of pseudo labels, i.e., fine-grained labels, coarse-grained labels, and instance labels. With the three kinds of labels, we could make full use of their own advantages to describe target domain samples from different perspectives. Meanwhile, we propose the Pseudo Labels Constraint (PLC) to improve the quality of the heterogeneous labels by using their consistency. 
Furthermore, in order to relieve the influence of noisy labels from the aspect of contrastive learning, we propose the Confidence Contrastive Loss (CCL) to consider the sample confidence in the learning process. Extensive experiments on four cross-domain tasks demonstrate that the proposed method achieves a new state-of-the-art performance, for example, the proposed method achieves 87.2% mAP and 95.0% Rank-1 accuracy on MSMT17<span><math><mo>→</mo></math></span>Market.</div></div>\",\"PeriodicalId\":49713,\"journal\":{\"name\":\"Pattern Recognition\",\"volume\":\"166 \",\"pages\":\"Article 111702\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-04-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Pattern Recognition\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0031320325003620\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320325003620","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Assigning pseudo labels is vital for cross-domain person re-identification (ReID), yet most existing methods assign only one kind of pseudo label to unlabeled target-domain samples, which cannot describe these samples accurately because of the large intra-class and small inter-class variances caused by diverse environmental factors such as occlusion, illumination, viewpoint, and pose. In this paper, we propose a novel label learning method named Heterogeneous Pseudo Labels (HPL) for cross-domain person ReID, which overcomes the large intra-class and small inter-class variances between pedestrian images in the target domain. For each unlabeled target-domain sample, HPL simultaneously learns three different kinds of pseudo labels: fine-grained labels, coarse-grained labels, and instance labels. With these three kinds of labels, we can exploit their respective advantages to describe target-domain samples from different perspectives. Meanwhile, we propose the Pseudo Labels Constraint (PLC), which uses the consistency among the heterogeneous labels to improve their quality. Furthermore, to relieve the influence of noisy labels from the perspective of contrastive learning, we propose the Confidence Contrastive Loss (CCL), which takes sample confidence into account during learning. Extensive experiments on four cross-domain tasks demonstrate that the proposed method achieves new state-of-the-art performance; for example, it achieves 87.2% mAP and 95.0% Rank-1 accuracy on MSMT17→Market.
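
The abstract describes the method only at a high level. As a rough, hypothetical illustration of the two ideas it names, the Python sketch below assumes that fine- and coarse-grained pseudo labels come from clustering target-domain features at two granularities (here DBSCAN with two eps values, an assumption of this sketch), that instance labels are unique per-sample IDs, and that a per-sample confidence scalar weights an InfoNCE-style loss. None of the function names, clustering parameters, or the loss form here is the paper's actual formulation.

```python
# Hypothetical illustration only -- not the authors' implementation.
import torch
import torch.nn.functional as F
from sklearn.cluster import DBSCAN


def heterogeneous_pseudo_labels(features, eps_fine=0.5, eps_coarse=0.7):
    """Assign three kinds of pseudo labels to unlabeled target-domain features.

    Fine/coarse granularity is simulated by two DBSCAN eps values (an assumption);
    instance labels simply treat every sample as its own class.
    """
    feats = F.normalize(features, dim=1).cpu().numpy()
    fine = DBSCAN(eps=eps_fine, min_samples=4, metric="cosine").fit_predict(feats)
    coarse = DBSCAN(eps=eps_coarse, min_samples=4, metric="cosine").fit_predict(feats)
    instance = torch.arange(features.size(0))
    return torch.as_tensor(fine), torch.as_tensor(coarse), instance


def confidence_contrastive_loss(queries, memory, labels, confidence, tau=0.05):
    """InfoNCE-style loss where each sample's term is scaled by its confidence,
    so samples with likely-noisy pseudo labels contribute less to the gradient."""
    logits = F.normalize(queries, dim=1) @ F.normalize(memory, dim=1).t() / tau
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    return (confidence * per_sample).mean()


# Example usage with random data: 128 target-domain features get three label
# sets; a batch of 8 queries is scored against 16 memory prototypes, with a
# per-sample confidence that would normally come from, e.g., label agreement.
if __name__ == "__main__":
    feats = torch.randn(128, 256)
    fine, coarse, inst = heterogeneous_pseudo_labels(feats)
    q, mem = torch.randn(8, 256), torch.randn(16, 256)
    lbl = torch.randint(0, 16, (8,))
    conf = torch.rand(8)
    print(confidence_contrastive_loss(q, mem, lbl, conf))
```

In this sketch, one plausible way to mimic the consistency idea behind PLC would be to lower a sample's confidence when its fine- and coarse-grained cluster assignments disagree; the paper's precise constraint may differ.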
Source journal

Pattern Recognition
Category: Engineering Technology - Engineering: Electronic & Electrical

CiteScore: 14.40
Self-citation rate: 16.20%
Articles published: 683
Review time: 5.6 months

Journal introduction: The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.