ALGRNet: Multi-Relational Adaptive Facial Action Unit Modelling for Face Representation and Relevant Recognitions

Xuri Ge, Joemon M. Jose, Pengcheng Wang, Arunachalam Iyer, Xiao Liu, Hu Han
{"title":"ALGRNet: Multi-Relational Adaptive Facial Action Unit Modelling for Face Representation and Relevant Recognitions","authors":"Xuri Ge;Joemon M. Jose;Pengcheng Wang;Arunachalam Iyer;Xiao Liu;Hu Han","doi":"10.1109/TBIOM.2023.3306810","DOIUrl":null,"url":null,"abstract":"Facial action units (AUs) represent the fundamental activities of a group of muscles, exhibiting subtle changes that are useful for various face analysis tasks. One practical application in real-life situations is the automatic estimation of facial paralysis. This involves analyzing the delicate changes in facial muscle regions and skin textures. It seems logical to assess the severity of facial paralysis by combining well-defined muscle regions (similar to AUs) symmetrically, thus creating a comprehensive facial representation. To this end, we have developed a new model to estimate the severity of facial paralysis automatically and is inspired by the facial action units (FAU) recognition that deals with rich, detailed facial appearance information, such as texture, muscle status, etc. Specifically, a novel Adaptive Local-Global Relational Network (ALGRNet) is designed to adaptively mine the context of well-defined facial muscles and enhance the visual details of facial appearance and texture, which can be flexibly adapted to facial-based tasks, e.g., FAU recognition and facial paralysis estimation. ALGRNet consists of three key structures: (i) an adaptive region learning module that identifies high-potential muscle response regions, (ii) a skip-BiLSTM that models the latent relationships among local regions, enabling better correlation between multiple regional lesion muscles and texture changes, and (iii) a feature fusion&refining module that explores the complementarity between the local and global aspects of the face. We have extensively evaluated ALGRNet to demonstrate its effectiveness using two widely recognized AU benchmarks, BP4D and DISFA. Furthermore, to assess the efficacy of FAUs in subsequent applications, we have investigated their application in the identification of facial paralysis. Experimental findings obtained from a facial paralysis benchmark, meticulously gathered and annotated by medical experts, underscore the potential of utilizing identified AU attributes to estimate the severity of facial paralysis.","PeriodicalId":73307,"journal":{"name":"IEEE transactions on biometrics, behavior, and identity science","volume":"5 4","pages":"566-578"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on biometrics, behavior, and identity science","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10225375/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Facial action units (AUs) represent the fundamental activities of groups of facial muscles, exhibiting subtle changes that are useful for various face analysis tasks. One practical real-life application is the automatic estimation of facial paralysis, which involves analyzing delicate changes in facial muscle regions and skin textures. It seems logical to assess the severity of facial paralysis by symmetrically combining well-defined muscle regions (similar to AUs), thus creating a comprehensive facial representation. To this end, we have developed a new model that automatically estimates the severity of facial paralysis, inspired by facial action unit (FAU) recognition, which deals with rich, detailed facial appearance information such as texture and muscle status. Specifically, a novel Adaptive Local-Global Relational Network (ALGRNet) is designed to adaptively mine the context of well-defined facial muscles and enhance the visual details of facial appearance and texture; it can be flexibly adapted to face-based tasks such as FAU recognition and facial paralysis estimation. ALGRNet consists of three key structures: (i) an adaptive region learning module that identifies high-potential muscle response regions, (ii) a skip-BiLSTM that models the latent relationships among local regions, enabling better correlation between multiple regional lesion muscles and texture changes, and (iii) a feature fusion-and-refining module that explores the complementarity between the local and global aspects of the face. We extensively evaluated ALGRNet on two widely recognized AU benchmarks, BP4D and DISFA, to demonstrate its effectiveness. Furthermore, to assess the efficacy of FAUs in downstream applications, we investigated their use in the identification of facial paralysis. Experimental findings on a facial paralysis benchmark, meticulously gathered and annotated by medical experts, underscore the potential of using the identified AU attributes to estimate the severity of facial paralysis.
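The abstract describes the three modules only at a high level, so the following PyTorch sketch is an illustration under stated assumptions rather than the authors' implementation: the class and parameter names (ALGRNetSketch, SkipBiLSTM, dim=512, n_regions=12, n_aus=12) are hypothetical, and the exact wiring of the adaptive region learning, the skip connection around the BiLSTM, and the fusion step may differ from the paper.

```python
import torch
import torch.nn as nn


class SkipBiLSTM(nn.Module):
    """Bidirectional LSTM over the sequence of per-region features, with a
    residual (skip) connection back to each region's input. This is one
    plausible reading of "skip-BiLSTM"; the paper's wiring may differ."""

    def __init__(self, dim):
        super().__init__()
        # hidden size dim // 2 per direction so the concatenated output is dim
        self.bilstm = nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, regions):               # regions: (B, N_regions, dim)
        context, _ = self.bilstm(regions)     # relational context per region
        return self.norm(regions + context)   # skip connection keeps local detail


class ALGRNetSketch(nn.Module):
    """Minimal sketch of the three ALGRNet stages named in the abstract:
    (i) adaptive region learning, (ii) skip-BiLSTM relation modelling,
    (iii) local-global feature fusion and refinement."""

    def __init__(self, dim=512, n_regions=12, n_aus=12):
        super().__init__()
        # (i) one spatial attention map per candidate muscle region
        self.region_attn = nn.Conv2d(dim, n_regions, kernel_size=1)
        # (ii) latent relationships among the pooled region features
        self.relation = SkipBiLSTM(dim)
        # (iii) fuse the refined local summary with the global face descriptor
        self.fuse = nn.Linear(2 * dim, dim)
        self.classifier = nn.Linear(dim, n_aus)

    def forward(self, feat_map):              # feat_map: (B, dim, H, W) backbone output
        attn = self.region_attn(feat_map).flatten(2).softmax(dim=-1)  # (B, N, H*W)
        feats = feat_map.flatten(2)                                   # (B, dim, H*W)
        # attention-weighted pooling yields one feature vector per region
        regions = torch.einsum('bnl,bcl->bnc', attn, feats)           # (B, N, dim)
        regions = self.relation(regions)       # relational refinement across regions
        local = regions.mean(dim=1)            # pooled local representation
        global_ = feats.mean(dim=-1)           # global average-pooled face feature
        fused = self.fuse(torch.cat([local, global_], dim=-1))
        return self.classifier(fused)          # per-AU logits
```

Feeding a batch of backbone feature maps, e.g. `ALGRNetSketch()(torch.randn(2, 512, 14, 14))`, yields per-AU logits of shape (2, 12). The residual connection in SkipBiLSTM lets each region retain its own appearance detail while absorbing relational context from the other regions, which matches the abstract's emphasis on correlating multiple regional lesion muscles.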