Sparse-attention augmented domain adaptation for unsupervised person re-identification

IF 3.9 · CAS Tier 3 (Computer Science) · Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Wei Zhang, Peijun Ye, Tao Su, Dihu Chen
{"title":"Sparse-attention augmented domain adaptation for unsupervised person re-identification","authors":"Wei Zhang ,&nbsp;Peijun Ye ,&nbsp;Tao Su ,&nbsp;Dihu Chen","doi":"10.1016/j.patrec.2024.11.013","DOIUrl":null,"url":null,"abstract":"<div><div>The domain gap persists as a demanding problem for unsupervised domain adaptive person re-identification (UDA re-ID). In response to this question, we present a novel Sparse self-Attention Augmented Domain Adaptation approach (SAADA Model) to promote network performance. In this work, we put forward a composite computational primitive (SAAP). The SAAP leverages sparse self-attention and convolution to enhance domain adaptation at the primitive level. Using SAAP as a core component, we construct an augmented bottleneck block to improve domain adaptation at the bottleneck block level. Finally, the augmented bottleneck block for domain adaptation can be cascaded into the SAADA module. After extensive experiments for UDA re-ID benchmarks, we deploy the SAADA module one time after the stage corresponding to the minimum feature map, and the performance of this method exceeds some SOTA methods. For example, the mAP has increased by 5.1% from the Market-1501 to the difficult MSMT17.</div></div>","PeriodicalId":54638,"journal":{"name":"Pattern Recognition Letters","volume":"187 ","pages":"Pages 8-13"},"PeriodicalIF":3.9000,"publicationDate":"2024-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition Letters","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167865524003222","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

The domain gap remains a challenging problem for unsupervised domain adaptive person re-identification (UDA re-ID). To address this problem, we present a novel Sparse self-Attention Augmented Domain Adaptation approach (the SAADA model) to improve network performance. In this work, we put forward a composite computational primitive (SAAP) that leverages sparse self-attention and convolution to enhance domain adaptation at the primitive level. Using SAAP as a core component, we construct an augmented bottleneck block to improve domain adaptation at the bottleneck-block level. Finally, the augmented bottleneck blocks for domain adaptation can be cascaded into the SAADA module. After extensive experiments on UDA re-ID benchmarks, we deploy the SAADA module once, after the stage with the smallest feature map, and this method outperforms several state-of-the-art methods. For example, mAP increases by 5.1% on the transfer from Market-1501 to the challenging MSMT17.
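The abstract describes SAAP as a primitive that fuses sparse self-attention with convolution, but gives no implementation details. As a minimal sketch, assuming one common sparsification scheme (top-k attention, a hypothetical simplification; the paper's actual sparsity pattern may differ), the sparse-attention half of such a primitive might look like:

```python
import numpy as np

def topk_sparse_attention(x, k):
    """Toy top-k sparse self-attention over token features x of shape (n, d).

    Each position attends only to its k highest-scoring positions; in a
    SAAP-style primitive this output would then be fused with a parallel
    convolution branch (omitted here). Illustrative only, not the paper's code.
    """
    n, d = x.shape
    # use x itself as queries/keys/values (no learned projections in this sketch)
    scores = x @ x.T / np.sqrt(d)                     # (n, n) scaled similarities
    # k-th largest score per row; everything below it is masked out
    kth = np.partition(scores, -k, axis=1)[:, -k][:, None]
    masked = np.where(scores >= kth, scores, -np.inf)
    # softmax over the surviving (top-k) entries; exp(-inf) contributes 0
    e = np.exp(masked - masked.max(axis=1, keepdims=True))
    attn = e / e.sum(axis=1, keepdims=True)
    return attn @ x                                   # (n, d) attended features
```

Setting k = n recovers dense self-attention, so the sparsity level is a tunable knob between a convolution-like local operator and full global attention.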
Source Journal

Pattern Recognition Letters (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 12.40
Self-citation rate: 5.90%
Articles per year: 287
Review time: 9.1 months
Journal introduction: Pattern Recognition Letters aims at rapid publication of concise articles of a broad interest in pattern recognition. Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.