Liora Manelis-Baram, Tal Barami, Michal Ilan, Gal Meiri, Idan Menashe, Elizabeth Soskin, Carmel Sofer, Ilan Dinstein
Title: Comparing three algorithms of automated facial expression analysis in autistic children: different sensitivities but consistent proportions
Journal: Molecular Autism, vol. 16, no. 1, p. 50
DOI: 10.1186/s13229-025-00685-x
Published: 2025-10-09
Abstract
Background: Difficulties with non-verbal communication, including atypical use of facial expressions, are a core feature of autism. Quantifying atypical use of facial expressions during naturalistic social interactions in a reliable, objective, and direct manner is difficult, but potentially possible with facial analysis computer vision algorithms that identify facial expressions in video recordings.
Methods: We analyzed over 5 million video frames from 100 verbal children aged 2-7 years (72 with autism and 28 controls), recorded during a ~45-minute ADOS-2 assessment (module 2 or 3) in which they interacted with a clinician. Three facial analysis algorithms (iMotions, FaceReader, and Py-Feat) were used to identify the presence of six facial expressions (anger, fear, sadness, surprise, disgust, and happiness) in each video frame. We then compared results across algorithms and between the autism and control groups using robust non-parametric statistical tests.
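The group comparison described above can be sketched as a non-parametric test on per-child proportions of frames classified with a given expression. The sketch below uses a permutation test as one example of such a test; the specific test, group sizes, and all numeric values are illustrative placeholders, not the study's data or exact method.

```python
# Illustrative sketch: non-parametric comparison of per-child proportions of
# frames classified as "happiness" between two groups. All values are made up.
import random

autism = [0.12, 0.08, 0.15, 0.10, 0.09, 0.14, 0.11, 0.13]
control = [0.11, 0.13, 0.10, 0.12, 0.09, 0.14, 0.15, 0.10]

def perm_test(a, b, n_perm=10000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        # Re-split the shuffled pool into groups of the original sizes
        diff = abs(sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b))
        if diff >= observed:
            hits += 1
    return hits / n_perm

p = perm_test(autism, control)
print(f"permutation p = {p:.3f}")
```

Rank-based alternatives such as the Mann-Whitney U test would serve the same purpose; the permutation approach is shown here only because it needs no distributional assumptions and no external dependencies.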
Results: The three facial analysis algorithms differed significantly in performance, including in the proportion of frames identified as containing a face and the proportion classified as containing each of the six examined facial expressions. Nevertheless, analyses with all three algorithms showed no significant differences in the quantity of any facial expression between children with autism and controls. Furthermore, the quantity of facial expressions did not correlate with autism symptom severity as measured by ADOS-2 CSS scores.
Limitations: The current findings are limited to verbal children with autism who completed ADOS-2 assessments using modules 2 and 3 and were able to sit in a stable manner while facing a wall-mounted camera. Furthermore, the analyses focused on comparing the quantity of facial expressions across groups rather than their quality, timing, or social context.
Conclusions: Commonly used automated facial analysis algorithms exhibit large variability in their output when identifying facial expressions of young children during naturalistic social interactions. Nonetheless, none of the three algorithms identified differences in the quantity of facial expressions across groups, suggesting that atypical production of facial expressions in verbal children with autism is more likely related to their quality, timing, and social context than to their quantity during natural social interaction.
About the journal:
Molecular Autism is a peer-reviewed, open access journal that publishes high-quality basic, translational and clinical research that has relevance to the etiology, pathobiology, or treatment of autism and related neurodevelopmental conditions. Research that includes integration across levels is encouraged. Molecular Autism publishes empirical studies, reviews, and brief communications.