{"title":"Nonlinear Cross-Domain Feature Representation Learning Method Based on Dual Constraints","authors":"Han Ding, Yuhong Zhang, Shuai Yang, Yaojin Lin","doi":"10.1109/ICBK.2019.00017","DOIUrl":null,"url":null,"abstract":"Feature representation learning is a research focus in domain adaptation. Recently, due to the fast training speed, the marginalized Denoising Autoencoder (mDA) as a standing deep learning model has been widely utilized for feature representation learning. However, the training of mDA suffers from the lack of nonlinear relationship and does not explicitly consider the distribution discrepancy between domains. To address these problems, this paper proposes a novel method for feature representation learning, namely Nonlinear cross-domain Feature learning based Dual Constraints (NFDC), which consists of kernelization and dual constraints. Firstly, we introduce kernelization to effectively extract nonlinear relationship in feature representation learning. Secondly, we design dual constraints including Maximum Mean Discrepancy (MMD) and Manifold Regularization (MR) in order to minimize distribution discrepancy during the training process. Experimental results show that our approach is superior to several state-of-the-art methods in domain adaptation tasks.","PeriodicalId":383917,"journal":{"name":"2019 IEEE International Conference on Big Knowledge (ICBK)","volume":"159 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Big Knowledge (ICBK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICBK.2019.00017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Feature representation learning is a research focus in domain adaptation. Recently, owing to its fast training speed, the marginalized Denoising Autoencoder (mDA), a representative deep learning model, has been widely used for feature representation learning. However, mDA cannot capture nonlinear relationships among features and does not explicitly consider the distribution discrepancy between domains. To address these problems, this paper proposes a novel feature representation learning method, Nonlinear cross-domain Feature learning based on Dual Constraints (NFDC), which consists of kernelization and dual constraints. First, we introduce kernelization to effectively extract nonlinear relationships during feature representation learning. Second, we design dual constraints, Maximum Mean Discrepancy (MMD) and Manifold Regularization (MR), to minimize the distribution discrepancy between domains during training. Experimental results show that our approach outperforms several state-of-the-art methods on domain adaptation tasks.
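To give a concrete sense of the MMD constraint mentioned in the abstract, the minimal sketch below computes an empirical squared MMD between source- and target-domain feature matrices using an RBF kernel. The function name, kernel choice, and bandwidth parameter are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's code): empirical squared MMD with an RBF kernel
# between a source feature matrix Xs and a target feature matrix Xt (rows = samples).
import numpy as np

def mmd_squared(Xs, Xt, gamma=1.0):
    """Biased estimate of squared MMD between the two sample sets."""
    def rbf(A, B):
        # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel values.
        d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)

    Kss, Ktt, Kst = rbf(Xs, Xs), rbf(Xt, Xt), rbf(Xs, Xt)
    return Kss.mean() + Ktt.mean() - 2 * Kst.mean()

# Toy usage: two domains whose feature distributions differ by a mean shift.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(100, 5))
Xt = rng.normal(0.5, 1.0, size=(100, 5))
print(mmd_squared(Xs, Xt))  # larger values indicate a larger distribution discrepancy
```

Used as a training penalty, a term of this form pushes the learned source and target feature representations toward matching distributions, which is the role the abstract assigns to the MMD part of the dual constraints.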