Automatic Identification of Facial Tics Using Selfie-Video.

IF 6.7 · CAS Tier 2 (Medicine) · JCR Q1, Computer Science, Information Systems
Yocheved Loewenstern, Noa Benaroya-Milshtein, Katya Belelovsky, Izhar Bar-Gad
DOI: 10.1109/JBHI.2024.3488285
Published: 2024-10-30, IEEE Journal of Biomedical and Health Informatics
Citations: 0

Abstract

The intrinsic nature of tic disorders, characterized by symptom variability and fluctuation, poses challenges in clinical evaluations. Currently, tic assessments predominantly rely on subjective questionnaires administered periodically during clinical visits, thus lacking continuous quantitative evaluation. This study aims to establish an automatic objective measure of tic expression in natural behavioral settings. A custom-developed smartphone application was used to record selfie-videos of children and adolescents with tic disorders exhibiting facial motor tics. Facial landmarks were utilized to extract tic-related features from video segments labeled as either "tic" or "non-tic". These features were then passed through a tandem of custom deep neural networks to learn spatial and temporal properties for tic classification of these segments according to their labels. The model achieved a mean accuracy of 95% when trained on data across all subjects, and consistently exceeded 90% accuracy in leave-one-session-out and leave-one-subject-out cross-validation training schemes. This automatic tic identification measure may provide a valuable tool for clinicians in facilitating diagnosis, patient follow-up, and treatment efficacy evaluation. Combining this measure with standard smartphone technology has the potential to revolutionize large-scale clinical studies, thereby expediting the development and testing of novel interventions.
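The abstract reports accuracy under leave-one-session-out and leave-one-subject-out cross-validation, i.e., every video segment from the held-out session or subject is excluded from training so the model is tested on unseen individuals. A minimal sketch of the subject-wise splitting is shown below; the function name, data layout, and toy labels are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def leave_one_subject_out(subject_ids):
    """Yield (train_idx, test_idx) index pairs, holding out all
    segments belonging to one subject per fold."""
    subject_ids = np.asarray(subject_ids)
    for subject in np.unique(subject_ids):
        test_mask = subject_ids == subject
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Toy example: six labeled video segments drawn from three subjects.
subjects = ["s1", "s1", "s2", "s2", "s3", "s3"]
folds = list(leave_one_subject_out(subjects))
print(len(folds))  # → 3 (one fold per held-out subject)
```

Per-fold accuracies from such a scheme would then be averaged to produce the subject-independent figure the abstract cites; scikit-learn's `LeaveOneGroupOut` offers the same splitting behavior if that dependency is acceptable.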

Source Journal

IEEE Journal of Biomedical and Health Informatics (Computer Science, Information Systems; Computer Science, Interdisciplinary Applications)
CiteScore: 13.60
Self-citation rate: 6.50%
Articles published per year: 1151
Journal description: IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include the acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics such as interoperability, evidence-based medicine, and secure patient data are also addressed.