Deep learning based MobileNet and multi-head attention model for facial expression recognition

Aicha Nouisser, Ramzi Zouari, M. Kherallah
{"title":"基于深度学习的mobilenet和多头注意模型的面部表情识别","authors":"Aicha Nouisser, Ramzi Zouari, M. Kherallah","doi":"10.34028/iajit/20/3a/6","DOIUrl":null,"url":null,"abstract":"Facial expressions is an intuitive reflection of a person’s emotional state, and it is one of the most important forms of interpersonal communication. Due to the complexity and variability of human facial expressions, traditional methods based on handcrafted feature extraction have shown insufficient performances. For this purpose, we proposed a new system of facial expression recognition based on MobileNet model with the addition of skip connections to prevent the degradation in performance in deeper architectures. Moreover, multi-head attention mechanism was applied to concentrate the processing on the most relevant parts of the image. The experiments were conducted on FER2013 database, which is imbalanced and includes ambiguities in some images containing synthetic faces. We applied a pre-processing step of face detection to eliminate wrong images, and we implemented both SMOTE and Near-Miss algorithms to get a balanced dataset and prevent the model to being biased. The experimental results showed the effectiveness of the proposed framework which achieved the recognition rate of 96.02% when applying multi-head attention mechanism","PeriodicalId":13624,"journal":{"name":"Int. Arab J. Inf. Technol.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep learning based mobilenet and multi-head attention model for facial expression recognition\",\"authors\":\"Aicha Nouisser, Ramzi Zouari, M. Kherallah\",\"doi\":\"10.34028/iajit/20/3a/6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Facial expressions is an intuitive reflection of a person’s emotional state, and it is one of the most important forms of interpersonal communication. Due to the complexity and variability of human facial expressions, traditional methods based on handcrafted feature extraction have shown insufficient performances. For this purpose, we proposed a new system of facial expression recognition based on MobileNet model with the addition of skip connections to prevent the degradation in performance in deeper architectures. Moreover, multi-head attention mechanism was applied to concentrate the processing on the most relevant parts of the image. The experiments were conducted on FER2013 database, which is imbalanced and includes ambiguities in some images containing synthetic faces. We applied a pre-processing step of face detection to eliminate wrong images, and we implemented both SMOTE and Near-Miss algorithms to get a balanced dataset and prevent the model to being biased. The experimental results showed the effectiveness of the proposed framework which achieved the recognition rate of 96.02% when applying multi-head attention mechanism\",\"PeriodicalId\":13624,\"journal\":{\"name\":\"Int. Arab J. Inf. Technol.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Int. Arab J. Inf. 
Technol.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.34028/iajit/20/3a/6\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. Arab J. Inf. Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.34028/iajit/20/3a/6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Facial expressions are an intuitive reflection of a person's emotional state and one of the most important forms of interpersonal communication. Due to the complexity and variability of human facial expressions, traditional methods based on handcrafted feature extraction have shown insufficient performance. For this purpose, we proposed a new facial expression recognition system based on the MobileNet model, with skip connections added to prevent performance degradation in deeper architectures. Moreover, a multi-head attention mechanism was applied to concentrate the processing on the most relevant parts of the image. The experiments were conducted on the FER2013 database, which is imbalanced and contains ambiguities in some images with synthetic faces. We applied a face-detection pre-processing step to eliminate invalid images, and we implemented both the SMOTE and Near-Miss algorithms to obtain a balanced dataset and prevent the model from becoming biased. The experimental results showed the effectiveness of the proposed framework, which achieved a recognition rate of 96.02% when the multi-head attention mechanism was applied.
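The abstract mentions a face-detection pre-processing step used to discard invalid FER2013 images, but does not name the detector. The sketch below is therefore only an illustrative assumption built on OpenCV's Haar cascade applied to 48x48 grayscale arrays (the FER2013 format); the helper `keep_valid_faces` is hypothetical.

```python
# Illustrative only: the paper does not specify its face detector. OpenCV's
# Haar cascade is assumed here, applied to 48x48 grayscale FER2013 images.
import cv2
import numpy as np

# Frontal-face Haar cascade shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def keep_valid_faces(images):
    """Keep only images in which at least one face is detected.

    `images`: iterable of 48x48 uint8 grayscale arrays (assumed format).
    """
    valid = []
    for img in images:
        faces = detector.detectMultiScale(img, scaleFactor=1.05, minNeighbors=3)
        if len(faces) > 0:
            valid.append(img)
    return np.asarray(valid)
```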
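To address the class imbalance of FER2013, the abstract reports using both SMOTE over-sampling and Near-Miss under-sampling, without stating how the two were combined. A minimal sketch with the imbalanced-learn library is given below, showing them as independent alternatives on flattened pixel vectors; sampling parameters and the NearMiss version are assumptions, not details from the paper.

```python
# Illustrative only: SMOTE and Near-Miss are shown as two separate balancing
# strategies on flattened 48x48 pixel vectors; parameters are assumptions.
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import NearMiss

def balance_with_smote(X, y, random_state=42):
    """Over-sample minority emotion classes up to the majority class size."""
    X_flat = X.reshape(len(X), -1)
    X_bal, y_bal = SMOTE(random_state=random_state).fit_resample(X_flat, y)
    return X_bal.reshape(-1, 48, 48), y_bal

def balance_with_nearmiss(X, y):
    """Under-sample majority emotion classes down to the minority class size."""
    X_flat = X.reshape(len(X), -1)
    X_bal, y_bal = NearMiss(version=1).fit_resample(X_flat, y)
    return X_bal.reshape(-1, 48, 48), y_bal
```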
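The model itself combines a MobileNet backbone, additional skip connections and multi-head attention. The Keras sketch below shows one plausible arrangement under those constraints: backbone feature maps are treated as a sequence of spatial tokens, refined by multi-head self-attention with a residual (skip) connection, then pooled and classified into the seven FER2013 emotion classes. The number of heads, the placement of the skip connection and the 3-channel input are assumptions for illustration, not the authors' published configuration.

```python
# Illustrative sketch only, not the authors' exact architecture: a MobileNet
# backbone extracts feature maps, which are flattened into spatial tokens and
# refined by multi-head self-attention with a residual (skip) connection
# before classification into the 7 FER2013 emotion classes.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fer_model(input_shape=(48, 48, 3), num_classes=7, num_heads=4):
    inputs = layers.Input(shape=input_shape)
    # MobileNet expects 3-channel input; grayscale FER2013 images would be
    # stacked to 3 channels beforehand (assumption).
    backbone = tf.keras.applications.MobileNet(
        include_top=False, weights=None, input_shape=input_shape)
    features = backbone(inputs)                      # (batch, h, w, c)
    h, w, c = features.shape[1], features.shape[2], features.shape[3]
    tokens = layers.Reshape((h * w, c))(features)    # spatial positions as tokens
    attn = layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=c // num_heads)(tokens, tokens)
    tokens = layers.Add()([tokens, attn])            # residual / skip connection
    tokens = layers.LayerNormalization()(tokens)
    pooled = layers.GlobalAveragePooling1D()(tokens)
    outputs = layers.Dense(num_classes, activation="softmax")(pooled)
    return models.Model(inputs, outputs)

model = build_fer_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In a full pipeline, the face-filtered and class-balanced data from the previous sketches would be stacked to three channels and passed to `model.fit`; the 96.02% figure reported in the abstract refers to the authors' own configuration, which this sketch does not claim to reproduce.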