Practical implementation of deep neural network for facial emotion recognition

Ferroudja Djellali, É. Deljanin
DOI: 10.54327/set2022/v2.i1.26 (https://doi.org/10.54327/set2022/v2.i1.26)
Journal: Bubble science engineering and technology, vol. 124, no. 1
Published: 2021-04-30 (Journal Article)
Citations: 0

Abstract

People's emotions are rarely put into words; far more often they are expressed through other cues. The key to intuiting another's feelings lies in the ability to read nonverbal channels: tone of voice, gesture, facial expression, and the like. Humans use facial expressions to convey many types of meaning in a variety of contexts. The range of meanings extends from basic, probably innate, social-emotional concepts such as "surprise" to complex, culture-specific concepts such as "neglect". The range of contexts in which humans use facial expressions extends from responses to events in the environment to specific linguistic constructs in sign languages. In this paper, we use an artificial neural network to classify each image into seven facial emotion classes. The model is trained on the FER+ image database, which we assume is large and diverse enough to indicate which model parameters are generally preferable. The overall results show that the CNN model can efficiently classify images according to emotional state, even in real time.
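To make the classification pipeline concrete, the following is a minimal sketch (not the authors' trained architecture) of a CNN-style forward pass on a single 48x48 grayscale face image, the format used by FER/FER+, ending in a softmax over seven emotion classes. All weights are random placeholders, and the class names are the standard seven FER labels, assumed here for illustration.

```python
import numpy as np

# Hypothetical label set: the seven basic FER emotion classes (assumed).
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping 2x2 max pooling (truncates odd edges)."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(img, kernel, weights, bias):
    feat = np.maximum(conv2d(img, kernel), 0.0)  # conv + ReLU
    feat = max_pool(feat)                        # 2x2 max pooling
    logits = feat.ravel() @ weights + bias       # dense layer -> 7 logits
    return softmax(logits)                       # class probabilities

img = rng.random((48, 48))                 # stand-in for one FER+ face image
kernel = rng.standard_normal((3, 3))       # one untrained 3x3 filter
feat_dim = ((48 - 2) // 2) ** 2            # 23*23 = 529 pooled features
weights = rng.standard_normal((feat_dim, len(EMOTIONS))) * 0.01
bias = np.zeros(len(EMOTIONS))

probs = predict(img, kernel, weights, bias)
print(EMOTIONS[int(np.argmax(probs))], float(probs.sum()))
```

A real implementation would stack several such conv/pool layers and learn the filters and dense weights by backpropagation on the FER+ training set; this sketch only shows the shape of the computation from raw pixels to a seven-way probability distribution.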