Deep Convolutional Neural Network for Robust Facial Emotion Recognition
Andrinandrasana David Rasamoelina, F. Adjailia, P. Sinčák
2019 IEEE International Symposium on INnovations in Intelligent SysTems and Applications (INISTA), 3 July 2019. DOI: 10.1109/INISTA.2019.8778282
Emotions and the ability to understand them are considered a channel of non-verbal communication. They are an important factor in achieving smooth yet robust interaction between machines and humans. In this paper, we review CNN-based methods for facial emotion recognition and propose a new cutting-edge deep learning approach for classifying facial expressions from images. To assess the effectiveness of the method, we evaluated it on multiple datasets: FER2013, AffectNet, RaFD, and KDEF, obtaining accuracies of 82.3%, 76.79%, 78.58%, and 77.08%, respectively. These results surpass the current state of the art. We also compared our results with those of publicly available facial emotion recognition APIs.
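The paper does not include code, and the abstract does not specify the network architecture. The sketch below is only an illustrative baseline for the general task it describes (a deep CNN classifying facial expressions from images), assuming FER2013-style 48x48 grayscale inputs and the seven standard emotion classes; the layer choices are assumptions, not the authors' model.

```python
# Illustrative baseline only: not the architecture from the paper.
# Assumes FER2013-style 48x48 grayscale inputs and 7 emotion classes.
import torch
import torch.nn as nn


class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),  # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),  # 24x24 -> 12x12
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.MaxPool2d(2),  # 12x12 -> 6x6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(),
            nn.Linear(256, num_classes),  # logits over the 7 emotion classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    # Quick shape check with a dummy batch of 48x48 grayscale faces.
    model = EmotionCNN()
    dummy = torch.randn(8, 1, 48, 48)
    print(model(dummy).shape)  # torch.Size([8, 7])
```

In practice such a model would be trained with cross-entropy loss on each dataset separately (FER2013, AffectNet, RaFD, KDEF), which is one plausible way the per-dataset accuracies reported above could be obtained; the paper itself should be consulted for the actual training setup.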