Transfer Learning with Ensemble of Multiple Feature Representations

Hang Zhao, Qing Liu, Yun Yang
{"title":"基于多特征表示集成的迁移学习","authors":"Hang Zhao, Qing Liu, Yun Yang","doi":"10.1109/SERA.2018.8477189","DOIUrl":null,"url":null,"abstract":"Supervised learning algorithms are to discover the hidden patterns of the statistics, assuming that the training data and the test data are from the same distribution. There are two challenges in the traditional supervised machine learning. One is that the test data distribution always differs largely from the training data distribution in the real world, while another is that there is usually very few labeled data to train a machine learning model. In such cases, transfer learning, which emphasizes the transfer of the previous knowledge from different but related domains and tasks, is recommended to deal with these problems. Traditional transfer learning methods care more about the data itself rather than the task. In fact, there is no one universal feature representation can perfectly benefit the model training work. But different feature representations can discover some independent latent knowledge from the original data. In this paper, we propose an instance-based transfer learning method, which is a weighted ensemble transfer learning framework with multiple feature representations. In our work, mutual information is applied as the smart weighting schema to measure the weight of each feature representation. Extensive experiments have been conducted on three facial expression recognition data sets: JAFFE, KDEF and FERG-DB. The experimental results demonstrate that our approach achieves better performance than the traditional transfer learning method and the non-transfer learning method.","PeriodicalId":161568,"journal":{"name":"2018 IEEE 16th International Conference on Software Engineering Research, Management and Applications (SERA)","volume":"50 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"17","resultStr":"{\"title\":\"Transfer Learning with Ensemble of Multiple Feature Representations\",\"authors\":\"Hang Zhao, Qing Liu, Yun Yang\",\"doi\":\"10.1109/SERA.2018.8477189\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Supervised learning algorithms are to discover the hidden patterns of the statistics, assuming that the training data and the test data are from the same distribution. There are two challenges in the traditional supervised machine learning. One is that the test data distribution always differs largely from the training data distribution in the real world, while another is that there is usually very few labeled data to train a machine learning model. In such cases, transfer learning, which emphasizes the transfer of the previous knowledge from different but related domains and tasks, is recommended to deal with these problems. Traditional transfer learning methods care more about the data itself rather than the task. In fact, there is no one universal feature representation can perfectly benefit the model training work. But different feature representations can discover some independent latent knowledge from the original data. In this paper, we propose an instance-based transfer learning method, which is a weighted ensemble transfer learning framework with multiple feature representations. In our work, mutual information is applied as the smart weighting schema to measure the weight of each feature representation. 
Extensive experiments have been conducted on three facial expression recognition data sets: JAFFE, KDEF and FERG-DB. The experimental results demonstrate that our approach achieves better performance than the traditional transfer learning method and the non-transfer learning method.\",\"PeriodicalId\":161568,\"journal\":{\"name\":\"2018 IEEE 16th International Conference on Software Engineering Research, Management and Applications (SERA)\",\"volume\":\"50 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"17\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE 16th International Conference on Software Engineering Research, Management and Applications (SERA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SERA.2018.8477189\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE 16th International Conference on Software Engineering Research, Management and Applications (SERA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SERA.2018.8477189","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 17

Abstract

Supervised learning algorithms aim to discover hidden patterns in data under the assumption that the training data and the test data come from the same distribution. Traditional supervised machine learning faces two challenges: in the real world, the test data distribution often differs substantially from the training data distribution, and there is usually too little labeled data to train a machine learning model. In such cases, transfer learning, which transfers previous knowledge from different but related domains and tasks, is recommended to address these problems. Traditional transfer learning methods focus more on the data itself than on the task. In fact, no single universal feature representation can perfectly benefit model training, but different feature representations can uncover independent latent knowledge from the original data. In this paper, we propose an instance-based transfer learning method: a weighted ensemble transfer learning framework with multiple feature representations. In our work, mutual information is applied as the weighting scheme to measure the weight of each feature representation. Extensive experiments have been conducted on three facial expression recognition data sets: JAFFE, KDEF, and FERG-DB. The experimental results demonstrate that our approach achieves better performance than traditional transfer learning and non-transfer learning methods.
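The abstract describes the core mechanism at a high level: each feature representation contributes its own base learner, and mutual information between the features and the labels supplies the ensemble weight. The sketch below illustrates one way such a scheme could look in Python with scikit-learn. It is not the authors' implementation (which is instance-based and framed as transfer learning); the choice of base learner, the use of mean per-feature mutual information as the weight, and the probability-level combination are all illustrative assumptions.

```python
# Minimal sketch of mutual-information-weighted ensembling over multiple
# feature representations. Hypothetical: base learner, weight aggregation,
# and combination rule are assumptions, not the paper's exact method.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression

def train_weighted_ensemble(representations, y_train):
    """Train one classifier per feature representation and weight it by the
    average mutual information between its features and the labels."""
    models, weights = [], []
    for X in representations:                    # each X: (n_samples, n_features)
        mi = mutual_info_classif(X, y_train)     # MI of each feature with the label
        weights.append(mi.mean())                # assumed aggregation: mean MI
        models.append(LogisticRegression(max_iter=1000).fit(X, y_train))
    weights = np.array(weights)
    weights /= weights.sum()                     # normalize to a convex combination
    return models, weights

def predict_weighted_ensemble(models, weights, test_representations):
    """Combine per-representation class probabilities using the MI-based weights."""
    proba = sum(w * m.predict_proba(X)
                for m, w, X in zip(models, weights, test_representations))
    return proba.argmax(axis=1)
```

In this reading, a representation whose features carry more information about the labels receives a larger say in the final prediction, which matches the abstract's claim that mutual information measures the weight of each feature representation.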