Temporal based Emotion Recognition inspired by Activity Recognition models

Balaganesh Mohan, Mirela C. Popa
{"title":"Temporal based Emotion Recognition inspired by Activity Recognition models","authors":"Balaganesh Mohan, Mirela C. Popa","doi":"10.1109/aciiw52867.2021.9666356","DOIUrl":null,"url":null,"abstract":"Affective computing is a subset of the larger field of human-computer interaction, having important connections with cognitive processes, influencing the learning process, decision-making and perception. Out of the multiple means of communication, facial expressions are one of the most widely accepted channels for emotion modulation, receiving an increased attention during the last few years. An important aspect, contributing to their recognition success, concerns modeling the temporal dimension. Therefore, this paper aims to investigate the applicability of current state-of-the-art action recognition techniques to the human emotion recognition task. In particular, two different architectures were investigated, a CNN-based model, named Temporal Shift Module (TSM) that can learn spatiotemporal features in 3D data with the computational complexity of a 2D CNN and a video based vision transformer, employing spatio-temporal self attention. The models were trained and tested on the CREMA-D dataset, demonstrating state-of-the-art performance, with a mean class accuracy of 82% and 77% respectively, while outperforming best previous approaches by at least 3.5%.","PeriodicalId":105376,"journal":{"name":"2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/aciiw52867.2021.9666356","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Affective computing is a subset of the larger field of human-computer interaction, with important connections to cognitive processes, influencing learning, decision-making, and perception. Among the multiple means of communication, facial expressions are one of the most widely accepted channels for emotion modulation and have received increased attention during the last few years. An important aspect contributing to their recognition success concerns modeling the temporal dimension. Therefore, this paper investigates the applicability of current state-of-the-art action recognition techniques to the human emotion recognition task. In particular, two different architectures were investigated: a CNN-based model, the Temporal Shift Module (TSM), which can learn spatio-temporal features in 3D data at the computational complexity of a 2D CNN, and a video-based vision transformer employing spatio-temporal self-attention. The models were trained and tested on the CREMA-D dataset, demonstrating state-of-the-art performance with mean class accuracies of 82% and 77%, respectively, while outperforming the best previous approaches by at least 3.5%.
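The key idea behind TSM is that shifting a small fraction of feature channels forward and backward along the time axis lets a plain 2D CNN exchange information between neighbouring frames at essentially no extra compute. The snippet below is a minimal, hypothetical PyTorch sketch of that shift operation; the shift fraction, tensor layout, and function name are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of a TSM-style temporal shift, assuming input of shape
# (batch, time, channels, height, width) and a 1/8 channel shift in each
# direction. Not the authors' code.
import torch


def temporal_shift(x: torch.Tensor, shift_div: int = 8) -> torch.Tensor:
    """Shift a fraction of channels along the temporal axis."""
    b, t, c, h, w = x.size()
    fold = c // shift_div
    out = torch.zeros_like(x)
    # First group of channels: pull features from the next frame (shift back in time).
    out[:, :-1, :fold] = x[:, 1:, :fold]
    # Second group: pull features from the previous frame (shift forward in time).
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]
    # Remaining channels are left untouched.
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]
    return out


if __name__ == "__main__":
    clip = torch.randn(2, 8, 64, 56, 56)  # (batch, frames, channels, H, W)
    shifted = temporal_shift(clip)
    print(shifted.shape)  # torch.Size([2, 8, 64, 56, 56])
```

After the shift, each frame's feature map already contains a slice of its neighbours' channels, so ordinary per-frame 2D convolutions can model short-range temporal dependencies without 3D kernels.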