Strong-Weak Integrated Semi-Supervision for Unsupervised Domain Adaptation

Xiaohu Lu, H. Radha
{"title":"无监督域自适应的强弱综合半监督","authors":"Xiaohu Lu, H. Radha","doi":"10.1109/ICIP46576.2022.9897242","DOIUrl":null,"url":null,"abstract":"Unsupervised domain adaptation (UDA) focuses on transferring knowledge learned in the labeled source domain to the unlabeled target domain. Semi-supervised learning is a proven strategy for improving UDA performance. In this paper, we propose a novel strong-weak integrated semi-supervision (SWISS) learning strategy for unsupervised domain adaptation. Under the proposed SWISSUDA framework, a strong representative set with high confidence but low diversity target domain samples and a weak representative set with low confidence but high diversity target domain samples are updated constantly during the training process. Both sets are fused randomly to generate an augmented strong-weak training batch with pseudo-labels to train the network during every iteration. Moreover, a novel adversarial logit loss is proposed to reduce the intra-class divergence between source and target domains, which is back-propagated adversarially with a gradient reverse layer between the classifier and the rest of the network. Experimental results based on two popular benchmarks, Office-Home, and DomainNet, show the effectiveness of the proposed SWISS framework with our method achieving the best performance in both Office-Home and DomainNet.","PeriodicalId":387035,"journal":{"name":"2022 IEEE International Conference on Image Processing (ICIP)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Strong-Weak Integrated Semi-Supervision for Unsupervised Domain Adaptation\",\"authors\":\"Xiaohu Lu, H. Radha\",\"doi\":\"10.1109/ICIP46576.2022.9897242\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Unsupervised domain adaptation (UDA) focuses on transferring knowledge learned in the labeled source domain to the unlabeled target domain. Semi-supervised learning is a proven strategy for improving UDA performance. In this paper, we propose a novel strong-weak integrated semi-supervision (SWISS) learning strategy for unsupervised domain adaptation. Under the proposed SWISSUDA framework, a strong representative set with high confidence but low diversity target domain samples and a weak representative set with low confidence but high diversity target domain samples are updated constantly during the training process. Both sets are fused randomly to generate an augmented strong-weak training batch with pseudo-labels to train the network during every iteration. Moreover, a novel adversarial logit loss is proposed to reduce the intra-class divergence between source and target domains, which is back-propagated adversarially with a gradient reverse layer between the classifier and the rest of the network. 
Experimental results based on two popular benchmarks, Office-Home, and DomainNet, show the effectiveness of the proposed SWISS framework with our method achieving the best performance in both Office-Home and DomainNet.\",\"PeriodicalId\":387035,\"journal\":{\"name\":\"2022 IEEE International Conference on Image Processing (ICIP)\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Image Processing (ICIP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIP46576.2022.9897242\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Image Processing (ICIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIP46576.2022.9897242","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Unsupervised domain adaptation (UDA) focuses on transferring knowledge learned in the labeled source domain to the unlabeled target domain. Semi-supervised learning is a proven strategy for improving UDA performance. In this paper, we propose a novel strong-weak integrated semi-supervision (SWISS) learning strategy for unsupervised domain adaptation. Under the proposed SWISSUDA framework, a strong representative set of high-confidence but low-diversity target-domain samples and a weak representative set of low-confidence but high-diversity target-domain samples are updated constantly during training. At every iteration, the two sets are fused randomly to generate an augmented strong-weak training batch with pseudo-labels for training the network. Moreover, a novel adversarial logit loss is proposed to reduce the intra-class divergence between the source and target domains; it is back-propagated adversarially through a gradient reversal layer placed between the classifier and the rest of the network. Experimental results on two popular benchmarks, Office-Home and DomainNet, show the effectiveness of the proposed SWISS framework, with our method achieving the best performance on both.
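The abstract describes two mechanisms that lend themselves to a short illustration: confidence-based construction and random fusion of the strong/weak representative sets, and a gradient reversal layer for the adversarial logit loss. Below is a minimal PyTorch sketch under stated assumptions; the function names, the confidence threshold `tau`, and the per-set cap `k` are hypothetical and are not taken from the paper.

```python
# Minimal sketch of the two mechanisms described in the abstract.
# All names, thresholds, and set sizes are illustrative assumptions,
# not the authors' exact procedure.
import torch
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity in the forward pass,
    gradient multiplied by -lambd in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)


@torch.no_grad()
def update_strong_weak_sets(features, logits, tau=0.9, k=64):
    """Split target-domain samples by classifier confidence:
    'strong' = high confidence (low diversity), 'weak' = low confidence
    (high diversity). Both carry pseudo-labels from the current classifier.
    The threshold tau and cap k are assumed, not from the paper."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo = probs.max(dim=1)
    strong_idx = (conf >= tau).nonzero(as_tuple=True)[0]
    weak_idx = (conf < tau).nonzero(as_tuple=True)[0]
    # keep the most confident samples for the strong set,
    # the least confident ones for the weak set
    strong_idx = strong_idx[conf[strong_idx].argsort(descending=True)][:k]
    weak_idx = weak_idx[conf[weak_idx].argsort()][:k]
    return (features[strong_idx], pseudo[strong_idx]), \
           (features[weak_idx], pseudo[weak_idx])


def fuse_strong_weak(strong, weak, batch_size=32):
    """Randomly fuse the two representative sets into one augmented
    strong-weak training batch with pseudo-labels."""
    feats = torch.cat([strong[0], weak[0]], dim=0)
    labels = torch.cat([strong[1], weak[1]], dim=0)
    perm = torch.randperm(feats.size(0))[:batch_size]
    return feats[perm], labels[perm]
```

In a training loop, the fused pseudo-labeled batch would supplement the labeled source batch for the classification loss, while `grad_reverse` would sit between the backbone and the classifier so that an adversarial logit loss computed on source/target logits is reversed before reaching the rest of the network, in the spirit of the adversarial back-propagation the abstract describes.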