Transfer Learning for Large Scale Data Using Subspace Alignment

Nassara Elhadji-Ille-Gado, E. Grall-Maës, M. Kharouf
{"title":"Transfer Learning for Large Scale Data Using Subspace Alignment","authors":"Nassara Elhadji-Ille-Gado, E. Grall-Maës, M. Kharouf","doi":"10.1109/ICMLA.2017.00-20","DOIUrl":null,"url":null,"abstract":"A major assumption in many machine learning algorithms is that the training and testing data must come from the same feature space or have the same distributions. However, in real applications, this strong hypothesis does not hold. In this paper, we introduce a new framework for transfer where the source and target domains are represented by subspaces described by eigenvector matrices. To unify subspace distribution between domains, we propose to use a fast efficient approximative SVD for fast features generation. In order to make a transfer learning between domains, we firstly use a subspace learning approach to develop a domain adaption algorithm where only target knowledge is transferable. Secondly, we use subspace alignment trick to propose a novel transfer domain adaptation method. To evaluate the proposal, we use large-scale data sets. Numerical results, based on accuracy and computational time are provided with comparison with state-of-the-art methods.","PeriodicalId":6636,"journal":{"name":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","volume":"28 1","pages":"1006-1010"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLA.2017.00-20","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

A major assumption in many machine learning algorithms is that the training and testing data come from the same feature space and follow the same distribution. However, in real applications this strong hypothesis often does not hold. In this paper, we introduce a new transfer learning framework in which the source and target domains are represented by subspaces described by eigenvector matrices. To unify the subspace distributions across domains, we propose a fast and efficient approximate SVD for feature generation. To transfer knowledge between domains, we first use a subspace learning approach to develop a domain adaptation algorithm in which only target knowledge is transferable. Second, we use a subspace alignment technique to propose a novel transfer domain adaptation method. We evaluate the proposal on large-scale data sets. Numerical results, in terms of accuracy and computational time, are reported and compared against state-of-the-art methods.
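The core idea in the abstract, representing each domain by the subspace spanned by its leading eigenvectors and then aligning the source subspace to the target one, can be illustrated with a minimal sketch. This is a generic subspace-alignment example, not the paper's exact algorithm: the function name, the use of a plain SVD (where the paper would substitute its fast approximate SVD), and the choice of subspace dimension `d` are all illustrative assumptions.

```python
import numpy as np

def subspace_alignment(Xs, Xt, d):
    """Project source and target data into aligned d-dimensional subspaces.

    Xs, Xt: (n_samples, n_features) source/target matrices.
    A full SVD is used here for clarity; a large-scale variant would
    swap in a randomized/approximate SVD to obtain the bases cheaply.
    """
    Xs_c = Xs - Xs.mean(axis=0)                 # center each domain
    Xt_c = Xt - Xt.mean(axis=0)
    _, _, Vs = np.linalg.svd(Xs_c, full_matrices=False)
    _, _, Vt = np.linalg.svd(Xt_c, full_matrices=False)
    Ps = Vs[:d].T                               # (n_features, d) source basis
    Pt = Vt[:d].T                               # (n_features, d) target basis
    M = Ps.T @ Pt                               # alignment matrix between bases
    src_aligned = Xs_c @ Ps @ M                 # source coords, target-aligned
    tgt_proj = Xt_c @ Pt                        # target coords in own subspace
    return src_aligned, tgt_proj

rng = np.random.default_rng(0)
Xs = rng.normal(size=(100, 20))                 # synthetic source domain
Xt = rng.normal(size=(80, 20)) + 1.0            # shifted target domain
src_aligned, tgt_proj = subspace_alignment(Xs, Xt, d=5)
print(src_aligned.shape, tgt_proj.shape)        # (100, 5) (80, 5)
```

After alignment, a classifier trained on `src_aligned` can be applied to `tgt_proj`, since both now live in comparable coordinates of the target subspace.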