Remote Inference of Cognitive Scores in ALS Patients Using a Picture Description

C. Agurto, G. Cecchi, Bo Wen, E. Fraenkel, James D. Berry, I. Navar, R. Norel
{"title":"Remote Inference of Cognitive Scores in ALS Patients Using a Picture Description","authors":"C. Agurto, G. Cecchi, Bo Wen, E. Fraenkel, James D. Berry, I. Navar, R. Norel","doi":"10.1109/ICDH60066.2023.00017","DOIUrl":null,"url":null,"abstract":"Amyotrophic lateral sclerosis (ALS) is a fatal disease that affects not only movement, speech, and breathing but also cognition. Recent studies have focused on the use of language analysis techniques to detect ALS and infer scales for monitoring functional progression. This paper focused on another important aspect, cognitive impairment, which affects 35-50% of the ALS population. In an effort to reach the ALS population, which frequently exhibits mobility limitations, we implemented the digital version of the Edinburgh Cognitive and Behavioral ALS Screen (ECAS) test for the first time. This test, designed to measure cognitive impairment, was remotely performed by 56 participants from the EverythingALS Speech Study1. As part of the study, participants (ALS and non-ALS) were asked to describe weekly one picture from a pool of many pictures with complex scenes displayed on their computer at home. We analyze the descriptions performed within +/− 60 days from the day the ECAS test was administered and extract different types of linguistic and acoustic features. We input those features into linear regression models to infer 5 ECAS sub-scores and the total score. Speech samples from the picture description are reliable enough to predict the ECAS subs-scores, achieving statistically significant Spearman correlation values between 0.32 and 0.51 for the model’s performance using 10-fold cross-validation.1https://www.everythingals.org/research","PeriodicalId":107307,"journal":{"name":"2023 IEEE International Conference on Digital Health (ICDH)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Digital Health (ICDH)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDH60066.2023.00017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Amyotrophic lateral sclerosis (ALS) is a fatal disease that affects not only movement, speech, and breathing but also cognition. Recent studies have focused on the use of language analysis techniques to detect ALS and to infer scales for monitoring functional progression. This paper focuses on another important aspect, cognitive impairment, which affects 35-50% of the ALS population. In an effort to reach the ALS population, which frequently exhibits mobility limitations, we implemented a digital version of the Edinburgh Cognitive and Behavioral ALS Screen (ECAS) test for the first time. This test, designed to measure cognitive impairment, was performed remotely by 56 participants from the EverythingALS Speech Study [1]. As part of the study, participants (ALS and non-ALS) were asked to describe one picture per week, drawn from a pool of pictures with complex scenes, displayed on their computer at home. We analyze the descriptions produced within +/- 60 days of the day the ECAS test was administered and extract different types of linguistic and acoustic features. We input those features into linear regression models to infer five ECAS sub-scores and the total score. Speech samples from the picture description are reliable enough to predict the ECAS sub-scores, achieving statistically significant Spearman correlation values between 0.32 and 0.51 for the model's performance under 10-fold cross-validation.

[1] https://www.everythingals.org/research
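The modeling setup described in the abstract (linguistic and acoustic speech features fed to linear regression models, with sub-score predictions evaluated by Spearman correlation under 10-fold cross-validation) can be illustrated with a minimal sketch. The feature matrix, target sub-score, sample counts, and preprocessing below are placeholder assumptions for illustration only, not the authors' actual data or pipeline.

```python
# Hypothetical sketch: features -> linear regression -> one ECAS sub-score,
# scored with Spearman correlation over pooled out-of-fold predictions
# from 10-fold cross-validation. All data here is synthetic.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_features = 56, 40                  # assumed: 56 participants, 40 speech features
X = rng.normal(size=(n_samples, n_features))    # placeholder linguistic + acoustic features
y = rng.normal(size=n_samples)                  # placeholder ECAS sub-score (e.g., language)

model = make_pipeline(StandardScaler(), LinearRegression())

# Collect out-of-fold predictions across the 10 folds.
y_pred = np.empty_like(y)
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    model.fit(X[train_idx], y[train_idx])
    y_pred[test_idx] = model.predict(X[test_idx])

# Rank-based agreement between observed and predicted sub-scores.
rho, p_value = spearmanr(y, y_pred)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

With real features and scores, one such model would be fit per ECAS sub-score and for the total score, and the pooled cross-validated correlation would be reported as the performance measure.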