Transfer Learning for Large Scale Data Using Subspace Alignment

Nassara Elhadji-Ille-Gado, E. Grall-Maës, M. Kharouf

2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 1006-1010, December 2017. DOI: 10.1109/ICMLA.2017.00-20
A major assumption in many machine learning algorithms is that the training and testing data come from the same feature space and follow the same distribution. In real applications, however, this strong hypothesis rarely holds. In this paper, we introduce a new transfer learning framework in which the source and target domains are represented by subspaces described by eigenvector matrices. To unify the subspace distributions of the two domains, we propose a fast and efficient approximate SVD for feature generation. To transfer knowledge between domains, we first use a subspace learning approach to develop a domain adaptation algorithm in which only target knowledge is transferable. Second, we apply the subspace alignment trick to derive a novel transfer domain adaptation method. We evaluate the proposal on large-scale data sets and report numerical results, in terms of accuracy and computational time, with comparisons against state-of-the-art methods.
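The subspace alignment idea referenced in the abstract can be sketched as follows. This is a minimal NumPy illustration of the standard subspace alignment formulation (align the source PCA basis to the target basis via M = Xs^T Xt), not the authors' implementation; the function name is hypothetical, and an exact SVD is used here where the paper proposes a fast approximate SVD.

```python
import numpy as np

def subspace_alignment(source, target, d):
    """Project source and target data into aligned d-dimensional subspaces."""
    # Center each domain, then take the top-d right singular vectors
    # as the PCA basis (columns of Xs and Xt, shape (p, d)).
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    Xs = np.linalg.svd(src_c, full_matrices=False)[2][:d].T
    Xt = np.linalg.svd(tgt_c, full_matrices=False)[2][:d].T

    # Alignment matrix mapping the source basis onto the target basis.
    M = Xs.T @ Xt

    # Source data in the aligned subspace; target data in its own subspace.
    Za = src_c @ Xs @ M
    Zt = tgt_c @ Xt
    return Za, Zt

rng = np.random.default_rng(0)
Za, Zt = subspace_alignment(rng.normal(size=(100, 20)),
                            rng.normal(size=(80, 20)), d=5)
print(Za.shape, Zt.shape)  # (100, 5) (80, 5)
```

After alignment, a classifier trained on `Za` with source labels can be applied directly to `Zt`, since both representations live in the target subspace coordinates.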