DiDA: Iterative Boosting of Disentangled Synthesis and Domain Adaptation

Jinming Cao, Oren Katzir, Peng Jiang, D. Lischinski, D. Cohen-Or, Changhe Tu, Yangyan Li
DOI: 10.1109/ITME53901.2021.00049
Published in: 2021 11th International Conference on Information Technology in Medicine and Education (ITME), November 2021, pp. 201-208.
Cited by: 1

Abstract

Unsupervised domain adaptation aims to learn a shared model for two related domains by transferring supervision from a labeled source domain to an unlabeled target domain. A number of effective domain adaptation approaches rely on the ability to extract domain-invariant latent factors that are common to both domains. Extracting latent commonality is also useful for disentanglement analysis: it enables separation between the common and the domain-specific features of both domains, which can then be recombined for synthesis. In this paper, we propose a strategy to boost the performance of domain adaptation and disentangled synthesis iteratively. The key idea is that by learning to separately extract both the common and the domain-specific features, one can synthesize additional target-domain data with supervision, thereby boosting domain adaptation performance. Better common-feature extraction, in turn, further improves the feature disentanglement and the subsequent disentangled synthesis. We show that iterating between domain adaptation and disentangled synthesis consistently improves both, on several unsupervised domain adaptation benchmark datasets and tasks, and under various domain adaptation backbone models.
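The core loop described in the abstract, disentangle, synthesize labeled pseudo-target data, re-adapt, can be illustrated with a deliberately tiny numerical sketch. This is a hypothetical toy, not the authors' implementation: the paper uses learned neural encoders and a domain adaptation backbone, whereas here each observation is modeled as a label-bearing common factor plus a domain-specific offset, the per-domain mean stands in for the domain-specific encoder, and a class-mean threshold stands in for the classifier.

```python
import numpy as np

# Toy model (assumed for illustration): observation = common factor + domain offset.
rng = np.random.default_rng(0)

def make_domain(n, offset):
    common = rng.normal(size=n)           # domain-invariant latent factor
    y = (common > 0).astype(int)          # label depends only on the common factor
    return common + offset, y             # observation mixes in a domain-specific part

xs, ys = make_domain(2000, offset=0.0)    # labeled source domain
xt, yt = make_domain(2000, offset=3.0)    # target domain (yt used only for evaluation)

def fit_threshold(x, y):
    # Midpoint between class means: a minimal stand-in for the DA backbone.
    return (x[y == 0].mean() + x[y == 1].mean()) / 2

# Without adaptation: a source-trained classifier fails on the shifted target.
thr_naive = fit_threshold(xs, ys)
acc_naive = ((xt > thr_naive).astype(int) == yt).mean()

# Disentanglement step: treat the per-domain mean as the domain-specific factor
# and the residual as the common factor (a crude stand-in for learned encoders).
spec_t = xt.mean()
common_s = xs - xs.mean()

# Synthesis step: recombine source common features with the target-specific
# factor, yielding labeled pseudo-target samples.
x_synth, y_synth = common_s + spec_t, ys

# Adaptation step: retrain on the synthesized labeled target-like data.
thr_dida = fit_threshold(x_synth, y_synth)
acc_dida = ((xt > thr_dida).astype(int) == yt).mean()

print(f"target accuracy without adaptation: {acc_naive:.2f}")
print(f"target accuracy after one synthesis round: {acc_dida:.2f}")
```

In this toy, the naive classifier is near chance on the target, while one disentangle-synthesize-adapt round recovers it; the paper iterates this loop so that improved adaptation sharpens the common/specific split, which in turn improves the next round of synthesis.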