Semi-Supervised Representation Learning: Transfer Learning with Manifold Regularized Auto-Encoders

Yi Zhu, Xuegang Hu, Yuhong Zhang, Peipei Li
{"title":"半监督表示学习:流形正则化自编码器的迁移学习","authors":"Yi Zhu, Xuegang Hu, Yuhong Zhang, Peipei Li","doi":"10.1109/ICBK.2018.00019","DOIUrl":null,"url":null,"abstract":"The excellent performance of transfer learning has emerged in the past few years. How to find feature representations which minimizes the distance between source and target domain is the crucial problem in transfer learning. Recently, deep learning methods have been proposed to learn higher level and robust representation. However, in traditional methods, label information in source domain is not designed to optimize both feature representations and parameters of the learning model. Additionally, data redundance may incur performance degradation on transfer learning. To address these problems, we propose a novel semi-supervised representation learning framework for transfer learning. To obtain this framework, manifold regularization is integrated for the parameters optimization, and the label information is encoded using a softmax regression model in auto-encoders. Meanwhile, whitening layer is introduced to reduce data redundance before auto-encoders. Extensive experiments demonstrate the effectiveness of our proposed framework compared to other competing state-of-the-art baseline methods.","PeriodicalId":144958,"journal":{"name":"2018 IEEE International Conference on Big Knowledge (ICBK)","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Semi-Supervised Representation Learning: Transfer Learning with Manifold Regularized Auto-Encoders\",\"authors\":\"Yi Zhu, Xuegang Hu, Yuhong Zhang, Peipei Li\",\"doi\":\"10.1109/ICBK.2018.00019\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The excellent performance of transfer learning has emerged in the past few years. 
How to find feature representations which minimizes the distance between source and target domain is the crucial problem in transfer learning. Recently, deep learning methods have been proposed to learn higher level and robust representation. However, in traditional methods, label information in source domain is not designed to optimize both feature representations and parameters of the learning model. Additionally, data redundance may incur performance degradation on transfer learning. To address these problems, we propose a novel semi-supervised representation learning framework for transfer learning. To obtain this framework, manifold regularization is integrated for the parameters optimization, and the label information is encoded using a softmax regression model in auto-encoders. Meanwhile, whitening layer is introduced to reduce data redundance before auto-encoders. Extensive experiments demonstrate the effectiveness of our proposed framework compared to other competing state-of-the-art baseline methods.\",\"PeriodicalId\":144958,\"journal\":{\"name\":\"2018 IEEE International Conference on Big Knowledge (ICBK)\",\"volume\":\"111 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE International Conference on Big Knowledge (ICBK)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICBK.2018.00019\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Big Knowledge 
(ICBK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICBK.2018.00019","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

The excellent performance of transfer learning has emerged in the past few years. Finding feature representations that minimize the distance between the source and target domains is a crucial problem in transfer learning. Recently, deep learning methods have been proposed to learn higher-level and more robust representations. However, traditional methods do not use the label information in the source domain to optimize both the feature representations and the parameters of the learning model. Additionally, data redundancy may degrade transfer-learning performance. To address these problems, we propose a novel semi-supervised representation learning framework for transfer learning. In this framework, manifold regularization is integrated into the parameter optimization, and the label information is encoded using a softmax regression model in the auto-encoders. Meanwhile, a whitening layer is introduced before the auto-encoders to reduce data redundancy. Extensive experiments demonstrate the effectiveness of the proposed framework compared to competing state-of-the-art baseline methods.
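The whitening layer applied before the auto-encoders can be sketched on its own. Below is a minimal ZCA-whitening sketch in NumPy; the abstract does not specify which whitening variant the authors use, so ZCA and the `eps` smoothing constant are assumptions for illustration.

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """Center X, then decorrelate its features to unit variance (ZCA).

    ZCA whitening rotates into the eigenbasis of the covariance, rescales
    each direction by 1/sqrt(eigenvalue), and rotates back, so the result
    stays as close as possible to the original data.
    """
    Xc = X - X.mean(axis=0)                 # center each feature
    cov = Xc.T @ Xc / len(Xc)               # sample covariance matrix
    U, S, _ = np.linalg.svd(cov)            # eigendecomposition (cov is PSD)
    # eps guards against dividing by near-zero eigenvalues
    return Xc @ U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T
```

After whitening, the feature covariance is approximately the identity, which removes the redundancy the abstract refers to before the data enters the auto-encoder.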
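The training objective described in the abstract combines three ingredients: auto-encoder reconstruction, a softmax regression loss on labeled source examples, and a manifold (graph Laplacian) regularizer on the learned codes. The sketch below is one plausible way to compose them, not the paper's exact formulation; the `tanh` activation, the k-NN affinity graph, and the trade-off weights `lam` and `gamma` are all assumptions.

```python
import numpy as np

def knn_graph(X, k=5):
    # Symmetrized binary k-nearest-neighbour affinity matrix.
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d, axis=1)[:, 1:k + 1]   # skip self (distance 0)
    W = np.zeros_like(d)
    for i, nbrs in enumerate(idx):
        W[i, nbrs] = 1.0
    return np.maximum(W, W.T)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # numerically stable
    return e / e.sum(axis=1, keepdims=True)

def objective(X, y, W_enc, b_enc, W_dec, b_dec, W_cls, b_cls, A,
              lam=1.0, gamma=0.1):
    """Reconstruction + supervised softmax + manifold regularization."""
    H = np.tanh(X @ W_enc + b_enc)               # encoder codes
    X_hat = H @ W_dec + b_dec                    # decoder reconstruction
    rec = ((X - X_hat) ** 2).mean()              # reconstruction error
    P = softmax(H @ W_cls + b_cls)               # softmax regression on codes
    n = len(y)
    ce = -np.log(P[np.arange(n), y] + 1e-12).mean()  # cross-entropy on labels
    L = np.diag(A.sum(axis=1)) - A               # graph Laplacian of affinity A
    # tr(H^T L H) penalizes codes that differ for neighbouring inputs
    manifold = np.trace(H.T @ L @ H) / n ** 2
    return rec + lam * ce + gamma * manifold
```

Because the Laplacian is positive semi-definite, the manifold term is non-negative and is minimized when nearby points in input space receive similar codes, which is the intuition behind manifold regularization.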