GLC++: Source-Free Universal Domain Adaptation Through Global-Local Clustering and Contrastive Affinity Learning

Impact Factor: 18.6
Sanqing Qu;Tianpei Zou;Florian Röhrbein;Cewu Lu;Guang Chen;Dacheng Tao;Changjun Jiang
{"title":"GLC++: Source-Free Universal Domain Adaptation Through Global-Local Clustering and Contrastive Affinity Learning","authors":"Sanqing Qu;Tianpei Zou;Florian Röhrbein;Cewu Lu;Guang Chen;Dacheng Tao;Changjun Jiang","doi":"10.1109/TPAMI.2025.3593669","DOIUrl":null,"url":null,"abstract":"Deep neural networks often exhibit sub-optimal performance under covariate and category shifts. Source-Free Domain Adaptation (SFDA) presents a promising solution to this dilemma, yet most SFDA approaches are restricted to closed-set scenarios. In this paper, we explore Source-Free Universal Domain Adaptation (SF-UniDA) aiming to accurately classify “known” data belonging to common categories and segregate them from target-private “unknown” data. We propose a novel Global and Local Clustering (GLC) technique, which comprises an adaptive one-vs-all global clustering algorithm to discern between target classes, complemented by a local k-NN clustering strategy to mitigate negative transfer. Despite the effectiveness, the inherent closed-set source architecture leads to uniform treatment of “unknown” data, impeding the identification of distinct “unknown” categories. To address this, we evolve GLC to GLC++, integrating a contrastive affinity learning strategy. We examine the superiority of GLC and GLC++ across multiple benchmarks and category shift scenarios. Remarkably, in the most challenging open-partial-set scenarios, GLC and GLC++ surpass GATE by 16.8% and 18.9% in H-score on VisDA, respectively. GLC++ enhances the novel category clustering accuracy of GLC by 4.1% in open-set scenarios on Office-Home. Furthermore, the introduced contrastive learning strategy not only enhances GLC but also significantly facilitates existing methodologies.","PeriodicalId":94034,"journal":{"name":"IEEE transactions on pattern analysis and machine intelligence","volume":"47 11","pages":"10646-10663"},"PeriodicalIF":18.6000,"publicationDate":"2025-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on pattern analysis and machine intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11123595/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Deep neural networks often exhibit sub-optimal performance under covariate and category shifts. Source-Free Domain Adaptation (SFDA) presents a promising solution to this dilemma, yet most SFDA approaches are restricted to closed-set scenarios. In this paper, we explore Source-Free Universal Domain Adaptation (SF-UniDA), aiming to accurately classify “known” data belonging to common categories and segregate them from target-private “unknown” data. We propose a novel Global and Local Clustering (GLC) technique, which comprises an adaptive one-vs-all global clustering algorithm to discern between target classes, complemented by a local k-NN clustering strategy to mitigate negative transfer. Despite its effectiveness, the inherent closed-set source architecture leads to uniform treatment of all “unknown” data, impeding the identification of distinct “unknown” categories. To address this, we evolve GLC into GLC++, integrating a contrastive affinity learning strategy. We demonstrate the superiority of GLC and GLC++ across multiple benchmarks and category shift scenarios. Remarkably, in the most challenging open-partial-set scenarios, GLC and GLC++ surpass GATE by 16.8% and 18.9% in H-score on VisDA, respectively. GLC++ improves the novel category clustering accuracy of GLC by 4.1% in open-set scenarios on Office-Home. Furthermore, the introduced contrastive learning strategy not only enhances GLC but also significantly benefits existing methodologies.
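
To make the two ingredients named in the abstract concrete, the sketch below illustrates, in generic PyTorch, (1) a local k-NN majority vote that refines per-sample pseudo-labels, in the spirit of the local clustering strategy used to mitigate negative transfer, and (2) an InfoNCE-style loss that treats each sample's nearest neighbours as an affinity (positive) set, in the spirit of contrastive affinity learning. This is a minimal illustration under our own assumptions, not the authors' GLC/GLC++ implementation; all names (refine_with_knn, affinity_contrastive_loss) and hyperparameters (k, tau) are hypothetical.

import torch
import torch.nn.functional as F


def refine_with_knn(features: torch.Tensor, pseudo_labels: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Replace each pseudo-label by the majority vote of its k nearest neighbours (generic sketch)."""
    feats = F.normalize(features, dim=1)              # work in cosine-similarity space
    sim = feats @ feats.t()                           # (N, N) pairwise similarities
    sim.fill_diagonal_(-float("inf"))                 # exclude each sample itself
    knn_idx = sim.topk(k, dim=1).indices              # (N, k) neighbour indices
    neighbour_labels = pseudo_labels[knn_idx]         # (N, k) labels of the neighbours
    return neighbour_labels.mode(dim=1).values        # majority vote per sample


def affinity_contrastive_loss(features: torch.Tensor, k: int = 5, tau: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss treating each sample's k nearest neighbours as positives (generic sketch)."""
    feats = F.normalize(features, dim=1)
    sim = feats @ feats.t() / tau                     # temperature-scaled similarities
    mask_self = torch.eye(len(feats), dtype=torch.bool, device=feats.device)
    sim = sim.masked_fill(mask_self, -float("inf"))   # never treat a sample as its own positive
    knn_idx = sim.topk(k, dim=1).indices              # affinity (positive) set per sample
    log_prob = F.log_softmax(sim, dim=1)              # normalise over all non-self pairs
    pos_log_prob = log_prob.gather(1, knn_idx)        # log-probability assigned to positives
    return -pos_log_prob.mean()                       # pull samples toward their affinity set


if __name__ == "__main__":
    feats = torch.randn(32, 128)                      # toy target-domain features
    labels = torch.randint(0, 4, (32,))               # toy pseudo-labels
    print(refine_with_knn(feats, labels, k=5).shape)  # torch.Size([32])
    print(affinity_contrastive_loss(feats).item())    # scalar loss value

In the actual method, the pseudo-labels would come from the adaptive one-vs-all global clustering step and the features from the frozen source-pretrained encoder; here both are random placeholders.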