Common feature extraction in multi-source domains for transfer learning

J. Tahmoresnezhad, S. Hashemi
DOI: 10.1109/IKT.2015.7288795
Published in: 2015 7th Conference on Information and Knowledge Technology (IKT)
Publication date: 2015-05-26
Citations: 7

Abstract

In transfer learning scenarios, finding a common feature representation is crucial for tackling domain shift, where the training (source domain) and test (target domain) sets differ in their distributions. However, classical dimensionality reduction approaches such as Fisher Discriminant Analysis (FDA) do not yield good results when dealing with the shift problem. In this paper we introduce CoMuT, a method for Common feature extraction in Multi-source domains for Transfer learning, which finds a common feature representation between different source and target domains. CoMuT projects the data into a latent space to reduce the drift in distributions across domains while preserving the separability between classes. CoMuT constructs the latent space in a semi-supervised manner to bridge the domains and relate them to each other. The projected domains have similar distributions, so classical machine learning methods can be applied to them to classify the target data. Empirical results indicate that CoMuT outperforms other dimensionality reduction methods on various artificial and real datasets.
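CoMuT's own projection is not reproduced in this abstract, so no implementation is given here. As a minimal sketch of the general idea the abstract describes — projecting source and target data so their distributions match before a standard classifier is applied — the following uses CORAL-style covariance alignment, a simpler stand-in technique, not the authors' method:

```python
# Illustrative sketch only: this is CORAL-style second-order alignment,
# a stand-in for the general "reduce distribution drift across domains"
# idea, NOT the CoMuT algorithm from the paper.
import numpy as np

def sqrtm_psd(C):
    """Symmetric PSD matrix square root via eigendecomposition."""
    vals, vecs = np.linalg.eigh(C)
    vals = np.clip(vals, 0.0, None)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

def coral_align(Xs, Xt, eps=1e-6):
    """Whiten the source features, then re-color them with the target
    covariance and shift to the target mean, so the aligned source
    matches the target's first- and second-order statistics."""
    d = Xs.shape[1]
    Cs = np.cov(Xs, rowvar=False) + eps * np.eye(d)
    Ct = np.cov(Xt, rowvar=False) + eps * np.eye(d)
    Ws = np.linalg.inv(sqrtm_psd(Cs))   # whitening transform
    Wt = sqrtm_psd(Ct)                  # re-coloring transform
    return (Xs - Xs.mean(axis=0)) @ Ws @ Wt + Xt.mean(axis=0)

# Synthetic shifted domains: source is scaled and translated vs. target.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(200, 5)) * 3.0 + 1.0
Xt = rng.normal(size=(300, 5))
Xs_aligned = coral_align(Xs, Xt)
# After alignment, a classifier trained on Xs_aligned sees data whose
# distribution resembles the target domain.
```

After alignment the source covariance and mean match the target's, which is the precondition the abstract states for applying classical machine learning methods to the projected domains.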