Smartphone-based gaze estimation for in-home autism research

IF 5.3 | CAS Region 2 (Medicine) | Q1 BEHAVIORAL SCIENCES
Autism Research | Pub Date: 2024-04-25 | DOI: 10.1002/aur.3140
Na Yeon Kim, Junfeng He, Qianying Wu, Na Dai, Kai Kohlhoff, Jasmin Turner, Lynn K. Paul, Daniel P. Kennedy, Ralph Adolphs, Vidhya Navalpakkam
{"title":"Smartphone-based gaze estimation for in-home autism research","authors":"Na Yeon Kim,&nbsp;Junfeng He,&nbsp;Qianying Wu,&nbsp;Na Dai,&nbsp;Kai Kohlhoff,&nbsp;Jasmin Turner,&nbsp;Lynn K. Paul,&nbsp;Daniel P. Kennedy,&nbsp;Ralph Adolphs,&nbsp;Vidhya Navalpakkam","doi":"10.1002/aur.3140","DOIUrl":null,"url":null,"abstract":"<p>Atypical gaze patterns are a promising biomarker of autism spectrum disorder. To measure gaze accurately, however, it typically requires highly controlled studies in the laboratory using specialized equipment that is often expensive, thereby limiting the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method could overcome such limitations and take advantage of the ubiquity of smartphones. As a proof-of-principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be efficiently collected, in-home and longitudinally by participants themselves, with sufficiently high accuracy (gaze estimation error below 1° visual angle on average) for quantitative, feature-based analysis. Using this approach, we show that autistic individuals have reduced gaze time on human faces and longer gaze time on non-social features in the background, thereby reproducing established findings in autism using just smartphones and no additional hardware. Our approach provides a foundation for scaling future research with larger and more representative participant groups at vastly reduced cost, also enabling better inclusion of underserved communities.</p>","PeriodicalId":131,"journal":{"name":"Autism Research","volume":null,"pages":null},"PeriodicalIF":5.3000,"publicationDate":"2024-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/aur.3140","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Autism Research","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/aur.3140","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BEHAVIORAL SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Atypical gaze patterns are a promising biomarker of autism spectrum disorder. Measuring gaze accurately, however, typically requires highly controlled laboratory studies with specialized equipment that is often expensive, limiting the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method could overcome such limitations and take advantage of the ubiquity of smartphones. As a proof of principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be collected efficiently, in-home and longitudinally, by participants themselves, with accuracy sufficiently high (average gaze estimation error below 1° of visual angle) for quantitative, feature-based analysis. Using this approach, we show that autistic individuals spend less gaze time on human faces and more gaze time on non-social features in the background, reproducing established findings in autism using just smartphones and no additional hardware. Our approach provides a foundation for scaling future research to larger and more representative participant groups at vastly reduced cost, and enables better inclusion of underserved communities.
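
To make the quantities in the abstract concrete, here is a minimal, hypothetical sketch (not the authors' code or pipeline) of two of them: gaze error expressed in degrees of visual angle, and a feature-based summary of how many gaze samples land on face regions versus the background. The names `GazeSample`, `error_in_degrees`, and `face_gaze_fraction`, as well as the assumed screen density (460 ppi) and viewing distance (30 cm), are illustrative assumptions and are not reported in the paper.

```python
# Hypothetical sketch of (1) converting an on-screen gaze error in pixels to
# degrees of visual angle, and (2) computing the fraction of gaze samples that
# fall inside face bounding boxes -- the kind of feature-based gaze summary the
# abstract describes. Screen density and viewing distance are assumptions.

import math
from dataclasses import dataclass

Box = tuple[float, float, float, float]  # face box: (left, top, right, bottom), pixels


@dataclass
class GazeSample:
    x: float      # estimated gaze x on screen, pixels
    y: float      # estimated gaze y on screen, pixels
    frame: int    # video frame the sample is aligned to


def error_in_degrees(err_px: float, ppi: float = 460, distance_cm: float = 30) -> float:
    """Visual angle subtended by an on-screen gaze error of err_px pixels."""
    err_cm = err_px / ppi * 2.54
    return math.degrees(2.0 * math.atan(err_cm / (2.0 * distance_cm)))


def face_gaze_fraction(samples: list[GazeSample],
                       faces_per_frame: dict[int, list[Box]]) -> float:
    """Fraction of gaze samples that fall inside any face box of their frame."""
    def on_face(s: GazeSample) -> bool:
        return any(l <= s.x <= r and t <= s.y <= b
                   for (l, t, r, b) in faces_per_frame.get(s.frame, []))
    return sum(on_face(s) for s in samples) / len(samples) if samples else 0.0


# Toy usage: a ~95 px error is about 1 degree under the assumed geometry,
# and one of the two gaze samples below lands inside the single face box.
print(round(error_in_degrees(95), 2))                        # ~1.0
samples = [GazeSample(100, 120, 0), GazeSample(400, 300, 0)]
faces = {0: [(80.0, 100.0, 150.0, 180.0)]}
print(face_gaze_fraction(samples, faces))                    # 0.5
```

A group comparison of the kind reported in the abstract would then contrast such per-participant face-gaze fractions (and analogous fractions for background features) between autistic and control participants.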


Source journal
Autism Research (Medicine, Behavioral Sciences)
CiteScore: 8.00
Self-citation rate: 8.50%
Articles published: 187
Review time: >12 weeks
Journal description: AUTISM RESEARCH covers the developmental disorders known as Pervasive Developmental Disorders (or autism spectrum disorders, ASDs). The Journal focuses on basic genetic, neurobiological, and psychological mechanisms and how these influence developmental processes in ASDs.