Emotion Recognition by Integrating Eye Movement Analysis and Facial Expression Model

V. Huynh, Hyung-Jeong Yang, Gueesang Lee, Soohyung Kim, In Seop Na
DOI: 10.1145/3310986.3311001
Published in: Proceedings of the 3rd International Conference on Machine Learning and Soft Computing, 2019-01-25
Citations: 7

Abstract

This paper presents an emotion recognition method that combines knowledge from the face and eye movements to improve system accuracy. Our method recognizes emotion in three fundamental stages. First, we use a deep learning model to obtain the probability of a sample belonging to each emotion. Then, eye movement features are extracted with an open-source framework that implements algorithms with state-of-the-art results on this task. A new set of 51 features is used to obtain emotion-related information for the corresponding sample. Finally, the emotion of a sample is recognized by combining the knowledge from the two previous stages. Experiments on the validation set of the Acted Facial Expressions in the Wild (AFEW) dataset show that eye movements improve the accuracy of the face model by 2.87%.
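The abstract does not specify how the face-model probabilities and eye-movement features are combined in the final stage. A minimal late-fusion sketch, assuming a weighted average of per-emotion probabilities from two classifiers (the weight `alpha`, the emotion list, and the function name are illustrative assumptions, not the authors' method):

```python
import numpy as np

# AFEW uses seven emotion categories; the fusion rule below is a hypothetical
# sketch of the paper's third stage, not the authors' exact combination.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def fuse_predictions(face_probs, eye_probs, alpha=0.7):
    """Weighted average of per-emotion probability vectors.

    face_probs, eye_probs: length-7 probability vectors summing to 1.
    alpha: weight given to the face model (assumed value).
    Returns the predicted emotion label and the fused probability vector.
    """
    face_probs = np.asarray(face_probs, dtype=float)
    eye_probs = np.asarray(eye_probs, dtype=float)
    combined = alpha * face_probs + (1.0 - alpha) * eye_probs
    return EMOTIONS[int(np.argmax(combined))], combined

# Illustrative numbers only: the eye-movement model softens the face model's
# confidence but the fused prediction here remains "happy".
label, combined = fuse_predictions(
    [0.10, 0.05, 0.05, 0.40, 0.25, 0.10, 0.05],   # face model output
    [0.05, 0.05, 0.05, 0.30, 0.35, 0.10, 0.10],   # eye-movement model output
)
print(label)  # "happy"
```

Because both inputs are valid probability distributions, the convex combination is one as well, so the fused vector can be thresholded or ranked like any classifier output.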