Multi-View Facial Expressions Analysis of Autistic Children in Social Play

Authors: Jiabei Zeng; Yujian Yuan; Lu Qu; Fei Chang; Xuran Sun; Jinqiuyu Gong; Xuling Han; Min Liu; Hang Zhao; Qiaoyun Liu; Shiguang Shan; Xilin Chen
Journal: IEEE Transactions on Affective Computing, vol. 16, no. 3, pp. 2200-2214
DOI: 10.1109/TAFFC.2025.3557458
Publication date: 2025-04-02
URL: https://ieeexplore.ieee.org/document/10948192/
Citations: 0
Abstract
Atypical facial expressions during interaction are among the early symptoms of autism spectrum disorder (ASD) and are included in standard diagnostic assessments. However, current methods rely on subjective human judgments, introducing bias and limiting objectivity. This paper proposes an automated framework for the objective and quantitative assessment of autistic children’s facial expressions during social play. First, we use four synchronized cameras to record interactions between children with ASD and teachers during teacher-led structured activities. To address the challenges posed by head movements and occluded faces, we introduce a multi-view facial expression recognition strategy, whose effectiveness is demonstrated by experiments in real-world applications. To quantify patterns of affect status and the dynamic complexity of facial expressions, we use the temporally accumulated distribution of the basic facial expressions and the multi-dimensional multi-scale entropy of the facial expression sequence. Analysis of these features reveals significant differences between children with ASD and typically developing (TD) children. Experimental results derived from our quantified features confirm conclusions drawn from previous research and experiential observations. With these facial expression features, ASD and TD children are accurately classified in empirical experiments (92.1% accuracy, 94.4% precision, 89.5% sensitivity, 94.7% specificity), suggesting the potential of our framework for improved ASD assessment.
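The abstract does not specify how the paper's multi-dimensional multi-scale entropy is computed. As a rough illustration of the general idea only, the sketch below applies standard multiscale sample entropy (coarse-grain the series at several time scales, then compute sample entropy at each scale) to a single hypothetical per-frame expression-probability channel. All function names and parameters here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series: -ln(A/B), where B and A count
    template matches of length m and m+1 within tolerance r (Chebyshev
    distance), with r given as a fraction of the series' std."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * np.std(x)

    def count_matches(mm):
        # All overlapping templates of length mm, compared pairwise.
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d < tol))
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=3, m=2, r=0.2):
    """Coarse-grain the series at scales 1..max_scale (means of
    non-overlapping windows) and compute sample entropy at each scale."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in range(1, max_scale + 1):
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out

# Hypothetical input: one channel of per-frame expression probabilities.
frame_probs = np.random.default_rng(0).random(600)
mse_profile = multiscale_entropy(frame_probs, max_scale=3)
```

A more regular signal (e.g. a slowly varying expression trace) yields lower entropy than an erratic one, which is the kind of dynamic-complexity contrast the paper reports between groups. The paper's "multi-dimensional" variant presumably extends this to the joint sequence over all basic expression categories; the single-channel version above is only the simplest case.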
Journal description:
The IEEE Transactions on Affective Computing is an international and interdisciplinary journal. Its primary goal is to share research findings on the development of systems capable of recognizing, interpreting, and simulating human emotions and related affective phenomena. The journal publishes original research on the underlying principles and theories that explain how and why affective factors shape human-technology interactions. It also focuses on how techniques for sensing and simulating affect can enhance our understanding of human emotions and processes. Additionally, the journal explores the design, implementation, and evaluation of systems that prioritize the consideration of affect in their usability. We also welcome surveys of existing work that provide new perspectives on the historical and future directions of this field.