Emotion Recognition Using Generative Adversarial Networks

Zhaoqin Peng, Jia Li, Zhengyi Sun
{"title":"Emotion Recognition Using Generative Adversarial Networks","authors":"Zhaoqin Peng, Jia Li, Zhengyi Sun","doi":"10.1109/ICCEIC51584.2020.00023","DOIUrl":null,"url":null,"abstract":"The ability to remotely perform Emotion Recognition in complex scenarios without any particular setup is beneficial to many applications. In recent years, plenty of researchers have proposed some papers related to Emotion Recognition. However, these methods have some limitations. Generally, they must meet the target face richness, have no occlusion, and have consistent lighting. For methods that consider occlusion, imbalanced label distribution, and illumination changes, many strong assumptions about the environment (e.g., remove occluded images, the imbalanced label’s degree is small). This paper proposes an Emotion Recognition method robust to occlusion, imbalanced labels, and user-independent. Specifically, we designed a GAN-based framework to specify labels to generate pictures and restore occluded images, complementing and completing the data manifold. To solve the problem of training instability and provide a reliable training process index, we improved ACGAN. 
We validate on CK+ and FER2013 datasets, where our approach obtains performance comparable or superior to existing methods.","PeriodicalId":135840,"journal":{"name":"2020 International Conference on Computer Engineering and Intelligent Control (ICCEIC)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Computer Engineering and Intelligent Control (ICCEIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCEIC51584.2020.00023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

The ability to remotely perform emotion recognition in complex scenarios, without any particular setup, benefits many applications. In recent years, many papers on emotion recognition have been published, but these methods share some limitations: they generally require abundant face data for the target subject, with no occlusion and consistent lighting. Methods that do consider occlusion, imbalanced label distributions, and illumination changes still make strong assumptions about the environment (e.g., that occluded images are removed, or that the degree of label imbalance is small). This paper proposes an emotion recognition method that is robust to occlusion and imbalanced labels and is user-independent. Specifically, we design a GAN-based framework that generates images for specified labels and restores occluded images, complementing and completing the data manifold. To address training instability and provide a reliable index of training progress, we improve ACGAN. We validate our approach on the CK+ and FER2013 datasets, where it obtains performance comparable or superior to existing methods.
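The abstract does not give architectural details, but the core idea of ACGAN-style label conditioning can be sketched as follows: the generator consumes noise concatenated with a one-hot emotion label, and the discriminator produces both a real/fake score and auxiliary class logits. Everything below is a hypothetical illustration, not the authors' implementation; the layer sizes, the 7 emotion classes, and the 48×48 FER2013-style image dimension are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 7     # basic emotion categories (assumption)
LATENT_DIM = 100    # noise dimension (assumption)
IMG_DIM = 48 * 48   # FER2013-sized grayscale image, flattened (assumption)

def one_hot(labels, num_classes=NUM_CLASSES):
    """Encode integer emotion labels as one-hot vectors."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

class Generator:
    """Maps (noise, class label) -> image; label conditioning as in ACGAN."""
    def __init__(self):
        self.W1 = rng.normal(0, 0.02, (LATENT_DIM + NUM_CLASSES, 256))
        self.W2 = rng.normal(0, 0.02, (256, IMG_DIM))

    def __call__(self, z, labels):
        x = np.concatenate([z, one_hot(labels)], axis=1)  # condition on label
        h = np.maximum(0.0, x @ self.W1)                  # ReLU hidden layer
        return np.tanh(h @ self.W2)                       # pixels in [-1, 1]

class Discriminator:
    """Outputs a real/fake score AND class logits (the auxiliary classifier)."""
    def __init__(self):
        self.W1 = rng.normal(0, 0.02, (IMG_DIM, 256))
        self.W_adv = rng.normal(0, 0.02, (256, 1))            # real/fake head
        self.W_cls = rng.normal(0, 0.02, (256, NUM_CLASSES))  # emotion head

    def __call__(self, imgs):
        h = np.maximum(0.0, imgs @ self.W1)
        return h @ self.W_adv, h @ self.W_cls

# Forward pass: generate images for requested emotion labels, then score them.
G, D = Generator(), Discriminator()
labels = rng.integers(0, NUM_CLASSES, size=4)
fake = G(rng.normal(size=(4, LATENT_DIM)), labels)
adv_score, class_logits = D(fake)
```

Because the discriminator must also classify the emotion, the generator is pushed to produce images that actually carry the requested label; this is what lets a framework like this synthesize samples for under-represented classes or plausible completions of occluded faces.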