Natalie G Wall, Oliver Smith, Linda Campbell, Carmel Loughland, Ulrich Schall
Using EEG and Eye Tracking to Evaluate an Emotion Recognition iPad App for Autistic Children.
Clinical EEG and Neuroscience, published online 2025-08-04. DOI: https://doi.org/10.1177/15500594251362402
Citations: 0
Abstract
Autism is a neurodevelopmental condition that impacts individuals' communication and social interaction skills. Autistic children often show smaller N170 amplitudes in response to faces than neurotypical children, and they also tend to avoid the salient areas of the face. Technology-based interventions have been developed to teach autistic children to recognise facial expressions, but results have varied considerably across studies. The current study explored the effectiveness of an iPad app designed to support autistic children in recognising facial expressions, examining how participants process facial information through event-related potential (ERP) and eye-tracking recordings. ERPs and eye tracking were recorded from 20 neurotypical and 15 autistic children aged between 6 and 12 years. The results replicated previous work: the autistic group showed smaller N170 and Vertex Positive Potential amplitudes and more scan time off the face than the non-autistic children. Following the intervention, some changes were observed in facial feature scanning among autistic participants, characterised by increased time spent on the face and fewer fixations. These findings add to work indicating that eye tracking may be a valuable biomarker for intervention outcomes in autism. Further research into the N170 as a biomarker is needed.