Gaze-based Screening of Autistic Traits for Adolescents and Young Adults using Prosaic Videos

Karan Ahuja, A. Bose, Mohit Jain, K. Dey, Anil Joshi, K. Achary, Blessin Varkey, Chris Harrison, Mayank Goel
DOI: 10.1145/3378393.3402242
Published in: Proceedings of the 3rd ACM SIGCAS Conference on Computing and Sustainable Societies (2020-05-26)
Citations: 4

Abstract

Autism Spectrum Disorder (ASD) is a universal and often lifelong neuro-developmental disorder. Individuals with ASD often present comorbidities such as epilepsy, depression, and anxiety. In the United States, in 2014, 1 out of 68 people was affected by autism, but worldwide, the reported prevalence drops to 1 in 160. This disparity is primarily due to underdiagnosis and unreported cases in resource-constrained environments. Wiggins et al. [1] found that, in the US, children of color are under-identified with ASD. Missing a diagnosis is not without consequences: approximately 26% of adults with ASD are under-employed and under-enrolled in higher education. Unfortunately, ASD diagnosis is not straightforward and involves a subjective assessment of the patient's behavior. Because such assessments can be noisy and even non-existent in low-resource environments, many cases go unidentified. Many such cases remain undiagnosed even when the patient reaches adolescence or adulthood. There is a need for an objective, low-cost, and ubiquitous approach to diagnose ASD. Autism is often characterized by symptoms such as limited interpersonal and social communication skills, and difficulty in face recognition and emotion interpretation. When watching video media, these symptoms can manifest as reduced eye fixation, resulting in characteristic gaze behaviors. Thus, we developed an approach to screen patients with ASD using their gaze behavior while they watch videos on a laptop screen. We used a dedicated eye tracker to record the participant's gaze. With data from 60 participants (35 with ASD and 25 without ASD), our algorithm demonstrates 92.5% classification accuracy after the participants watched 15 seconds of the video. We also developed a proof-of-concept regression model that estimates the severity of the condition and achieves a mean absolute error of 2.03 on the Childhood Autism Rating Scale (CARS).
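The pipeline described above can be sketched roughly as follows. This is a hypothetical illustration, not the authors' published algorithm: the abstract does not specify the gaze features or the model, so the fixation statistics, the fixation-radius threshold, and the nearest-centroid classifier here are all assumptions. Only the evaluation metric (mean absolute error, reported as 2.03 against CARS) comes from the text.

```python
# Hypothetical sketch of a gaze-based screening pipeline (NOT the authors'
# published method): summarize a 15-second gaze trace into crude fixation
# features, classify with a simple nearest-centroid rule, and score a
# severity estimate with mean absolute error, the metric the abstract
# reports on the CARS scale.
import numpy as np

def fixation_features(gaze_xy, fixation_radius=0.02):
    """Reduce a (T, 2) array of normalized gaze coordinates to three
    coarse statistics: spatial dispersion, total scan-path length, and
    the fraction of inter-sample steps small enough to count as fixation.
    The radius is an illustrative threshold, not a published value."""
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    steps = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1)
    return np.array([
        gaze_xy.std(axis=0).mean(),               # dispersion around the mean gaze point
        steps.sum(),                              # total scan-path length
        float(np.mean(steps < fixation_radius)),  # fraction of "fixation" steps
    ])

def nearest_centroid_predict(X_train, y_train, x):
    """Assign feature vector x to the class whose training centroid is closest."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    labels = sorted(set(y_train.tolist()))
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))

def mean_absolute_error(y_true, y_pred):
    """MAE between true and predicted severity scores (e.g., on CARS)."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))
```

On a steady fixation trace the fixation fraction is 1.0, while on a widely scattered trace it approaches 0; any real system would exploit a much richer separation than this, but the shape of the computation is the same.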
One of the most common approaches to identify individuals with ASD involves studying family home videos and investigating an infant's gaze and interactions with their families. However, having an expert carefully inspect hours of home video is expensive and unscalable. Our approach is more accessible and ubiquitous as we can directly sense the gaze of the user while they watch videos. Such sensing can be directly deployed on billions of smartphones around the world that are equipped with a front-facing camera. In our current exploration, we use a dedicated eye-tracker, but achieving similar performance using an unmodified smartphone camera is not far-fetched. Our results demonstrate that passively tracking a user's gaze pattern while they watch videos on a screen can enable robust identification of individuals with ASD. Past work has used specially-created visual content to detect ASD, but getting large segments of the population to watch specific videos is hard. Thus, we focus on generic content and selected four prosaic video scenes as a proof of concept. Our research team includes experienced psychologists to inform the study design and contextualize the performance of the final system. Although our gaze tracking approach cannot yet replace a clinical assessment, we believe it could be valuable for screening individuals passively, as they consume media content on computing devices (e.g., YouTube, Netflix, in-game cut scenes). We believe our efforts in estimating condition severity are also an essential first step towards building an entirely automated in-home screening and condition-management tool. With rapid advancements in gaze tracking on consumer devices (e.g., Apple iPhone, HTC Vive), autism detection could be included on modern computing devices as a downloadable app or background feature, and potentially reduce the number of undiagnosed cases. Such a system could also track the efficacy of treatment and interventions.
Additionally, ASD detection could be used to automatically adapt user interfaces, which has been shown to improve accessibility.