A Hierarchical Separation and Classification Network for Dynamic Microexpression Classification

IF 4.5 | CAS Tier 2 (Computer Science) | JCR Q1 (Computer Science, Cybernetics)
Jordan Vice;Masood Mehmood Khan;Tele Tan;Iain Murray;Svetlana Yanushkevich
{"title":"用于动态微表情分类的分层分离和分类网络","authors":"Jordan Vice;Masood Mehmood Khan;Tele Tan;Iain Murray;Svetlana Yanushkevich","doi":"10.1109/TCSS.2023.3334823","DOIUrl":null,"url":null,"abstract":"Macrolevel facial muscle variations, as used for building models of seven discrete facial expressions, suffice when distinguishing between macrolevel human affective states but won’t discretise continuous and dynamic microlevel variations in facial expressions. We present a hierarchical separation and classification network (HSCN) for discovering dynamic, continuous, and macro- and microlevel variations in facial expressions of affective states. In the HSCN, we first invoke an unsupervised cosine similarity-based separation method on continuous facial expression data to extract twenty-one dynamic facial expression classes from the seven common discrete affective states. The between-clusters separation is then optimized for discovering the macrolevel changes resulting from facial muscle activations. A following step in the HSCN separates the upper and lower facial regions for realizing changes pertaining to upper and lower facial muscle activations. Data from the two separated facial regions are then clustered in a linear discriminant space using similarities in muscular activation patterns. Next, the actual dynamic expression data are mapped onto discriminant features for developing a rule-based expert system that facilitates classifying twenty-one upper and twenty-one lower microexpressions. Invoking the random forest algorithm would classify twenty-one macrolevel facial expressions with 76.11% accuracy. A support vector machine (SVM), used separately on upper and lower facial regions in tandem, could classify them with respective accuracies of 73.63% and 87.68%. This work demonstrates a novel and effective method of dynamic assessment of affective states. The HSCN further demonstrates that facial muscle variations gathered from either upper, lower, or full-face would suffice classifying affective states. We also provide new insight into discovery of microlevel facial muscle variations and their utilization in dynamic assessment of facial expressions of affective states.","PeriodicalId":13044,"journal":{"name":"IEEE Transactions on Computational Social Systems","volume":null,"pages":null},"PeriodicalIF":4.5000,"publicationDate":"2024-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10400570","citationCount":"0","resultStr":"{\"title\":\"A Hierarchical Separation and Classification Network for Dynamic Microexpression Classification\",\"authors\":\"Jordan Vice;Masood Mehmood Khan;Tele Tan;Iain Murray;Svetlana Yanushkevich\",\"doi\":\"10.1109/TCSS.2023.3334823\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Macrolevel facial muscle variations, as used for building models of seven discrete facial expressions, suffice when distinguishing between macrolevel human affective states but won’t discretise continuous and dynamic microlevel variations in facial expressions. We present a hierarchical separation and classification network (HSCN) for discovering dynamic, continuous, and macro- and microlevel variations in facial expressions of affective states. In the HSCN, we first invoke an unsupervised cosine similarity-based separation method on continuous facial expression data to extract twenty-one dynamic facial expression classes from the seven common discrete affective states. 
The between-clusters separation is then optimized for discovering the macrolevel changes resulting from facial muscle activations. A following step in the HSCN separates the upper and lower facial regions for realizing changes pertaining to upper and lower facial muscle activations. Data from the two separated facial regions are then clustered in a linear discriminant space using similarities in muscular activation patterns. Next, the actual dynamic expression data are mapped onto discriminant features for developing a rule-based expert system that facilitates classifying twenty-one upper and twenty-one lower microexpressions. Invoking the random forest algorithm would classify twenty-one macrolevel facial expressions with 76.11% accuracy. A support vector machine (SVM), used separately on upper and lower facial regions in tandem, could classify them with respective accuracies of 73.63% and 87.68%. This work demonstrates a novel and effective method of dynamic assessment of affective states. The HSCN further demonstrates that facial muscle variations gathered from either upper, lower, or full-face would suffice classifying affective states. We also provide new insight into discovery of microlevel facial muscle variations and their utilization in dynamic assessment of facial expressions of affective states.\",\"PeriodicalId\":13044,\"journal\":{\"name\":\"IEEE Transactions on Computational Social Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2024-01-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10400570\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Computational Social Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10400570/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, CYBERNETICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Computational Social Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10400570/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0

Abstract

Macrolevel facial muscle variations, as used for building models of seven discrete facial expressions, suffice for distinguishing between macrolevel human affective states but do not discretize the continuous and dynamic microlevel variations in facial expressions. We present a hierarchical separation and classification network (HSCN) for discovering dynamic, continuous, macro- and microlevel variations in facial expressions of affective states. In the HSCN, we first invoke an unsupervised cosine similarity-based separation method on continuous facial expression data to extract twenty-one dynamic facial expression classes from the seven common discrete affective states. The between-cluster separation is then optimized to discover the macrolevel changes resulting from facial muscle activations. A subsequent step in the HSCN separates the upper and lower facial regions to capture changes pertaining to upper and lower facial muscle activations. Data from the two separated facial regions are then clustered in a linear discriminant space using similarities in muscular activation patterns. Next, the actual dynamic expression data are mapped onto the discriminant features to develop a rule-based expert system that facilitates classifying twenty-one upper and twenty-one lower microexpressions. A random forest classifier recognized the twenty-one macrolevel facial expressions with 76.11% accuracy. Support vector machines (SVMs), applied separately to the upper and lower facial regions, classified them with accuracies of 73.63% and 87.68%, respectively. This work demonstrates a novel and effective method for the dynamic assessment of affective states. The HSCN further demonstrates that facial muscle variations gathered from the upper face, the lower face, or the full face suffice for classifying affective states. We also provide new insight into the discovery of microlevel facial muscle variations and their use in the dynamic assessment of facial expressions of affective states.
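The pipeline described in the abstract (cosine similarity-based unsupervised separation into twenty-one dynamic classes, between-cluster optimization in a linear discriminant space, an upper/lower facial split, and random forest/SVM classification) can be illustrated with a short sketch. The snippet below is a minimal, hypothetical approximation using scikit-learn (≥ 1.2 for the metric keyword of AgglomerativeClustering), assuming per-frame facial muscle activation features are already extracted; the array shapes, the agglomerative cosine clustering, the region split index, and the reuse of the macrolevel cluster labels for the regional SVMs are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of an HSCN-style pipeline (not the authors' code).
# Assumes per-frame facial muscle activation features (e.g., action unit
# intensities) are already extracted; data here are random placeholders.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_frames, n_features = 2100, 34          # hypothetical dataset size
X = rng.normal(size=(n_frames, n_features))

# Step 1: unsupervised, cosine similarity-based separation of the continuous
# expression data into twenty-one dynamic expression clusters.
clusterer = AgglomerativeClustering(n_clusters=21, metric="cosine", linkage="average")
macro_labels = clusterer.fit_predict(X)

# Step 2: optimize between-cluster separation in a linear discriminant space.
lda = LinearDiscriminantAnalysis(n_components=20)
X_lda = lda.fit_transform(X, macro_labels)

# Step 3: split the features into upper- and lower-face regions
# (the split index is an assumption for illustration).
upper, lower = X[:, : n_features // 2], X[:, n_features // 2 :]

def holdout_accuracy(clf, feats, labels):
    """Train on 70% of the frames and report accuracy on the held-out 30%."""
    tr_x, te_x, tr_y, te_y = train_test_split(
        feats, labels, test_size=0.3, random_state=0
    )
    clf.fit(tr_x, tr_y)
    return accuracy_score(te_y, clf.predict(te_x))

# Step 4: random forest on the macrolevel classes, SVMs on each facial region.
# The regional SVMs reuse the macrolevel cluster labels here; in the paper the
# upper/lower microexpression labels come from a rule-based expert system.
print("macro RF :", holdout_accuracy(RandomForestClassifier(n_estimators=200, random_state=0), X_lda, macro_labels))
print("upper SVM:", holdout_accuracy(SVC(kernel="rbf"), upper, macro_labels))
print("lower SVM:", holdout_accuracy(SVC(kernel="rbf"), lower, macro_labels))
```

The reported accuracies (76.11% for the macrolevel random forest, 73.63% and 87.68% for the upper- and lower-face SVMs) come from the paper's experiments on real facial expression data; the placeholder data above only demonstrate the structure of the pipeline.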
Source Journal
IEEE Transactions on Computational Social Systems
Category: Social Sciences (miscellaneous)
CiteScore: 10.00
Self-citation rate: 20.00%
Articles per year: 316
Journal description: IEEE Transactions on Computational Social Systems focuses on such topics as modeling, simulation, analysis, and understanding of social systems from the quantitative and/or computational perspective. "Systems" include man-man, man-machine, and machine-machine organizations and adversarial situations, as well as social media structures and their dynamics. More specifically, the transactions publishes articles on modeling the dynamics of social systems, methodologies for incorporating and representing socio-cultural and behavioral aspects in computational modeling, analysis of social system behavior and structure, and paradigms for social systems modeling and simulation. The journal also features articles on social network dynamics, social intelligence and cognition, social systems design and architectures, socio-cultural modeling and representation, and computational behavior modeling, and their applications.