Your turn: At home turning angle estimation for Parkinson’s disease severity assessment

IF 6.2 | CAS Medicine Zone 2 | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Qiushuo Cheng, Catherine Morgan, Arindam Sikdar, Alessandro Masullo, Alan Whone, Majid Mirmehdi
{"title":"你的转弯:帕金森病严重程度评估的在家转弯角度估计","authors":"Qiushuo Cheng ,&nbsp;Catherine Morgan ,&nbsp;Arindam Sikdar ,&nbsp;Alessandro Masullo ,&nbsp;Alan Whone ,&nbsp;Majid Mirmehdi","doi":"10.1016/j.artmed.2025.103194","DOIUrl":null,"url":null,"abstract":"<div><div>People with Parkinson’s Disease (PD) often experience progressively worsening gait, including changes in how they turn around, as the disease progresses. Existing clinical rating tools are not capable of capturing hour-by-hour variations of PD symptoms, as they are confined to brief assessments within clinic settings, leaving gait performance outside these controlled environments unaccounted for. Measuring turning angles continuously and passively is a component step towards using gait characteristics as sensitive indicators of disease progression in PD. This paper presents a deep learning-based approach to automatically quantify turning angles by extracting 3D skeletons from videos and calculating the rotation of hip and knee joints. We utilise advanced human pose estimation models, Fastpose and Strided Transformer, on a total of 1386 turning video clips from 24 subjects (12 people with PD and 12 healthy control volunteers), trimmed from a PD dataset of unscripted free-living videos in a home-like setting (Turn-REMAP). We also curate a turning video dataset, Turn-H3.6M, from the public Human3.6M human pose benchmark with 3D groundtruth, to further validate our method. Previous gait research has primarily taken place in clinics or laboratories evaluating scripted gait outcomes, but this work focuses on free-living home settings where complexities exist, such as baggy clothing and poor lighting. Due to difficulties in obtaining accurate groundtruth data in a free-living setting, we quantise the angle into the nearest bin 45° based on the manual labelling of expert clinicians. Our method achieves a turning calculation accuracy of 41.6%, a Mean Absolute Error (MAE) of 34.7°, and a weighted precision (WPrec) of 68.3% for Turn-REMAP. On Turn-H3.6M, it achieves an accuracy of 73.5%, an MAE of 18.5°, and a WPrec of 86.2%. This is the first work to explore the use of single monocular camera data to quantify turns by PD patients in a home setting. All data and models are publicly available, providing a baseline for turning parameter measurement to promote future PD gait research.</div></div>","PeriodicalId":55458,"journal":{"name":"Artificial Intelligence in Medicine","volume":"167 ","pages":"Article 103194"},"PeriodicalIF":6.2000,"publicationDate":"2025-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Your turn: At home turning angle estimation for Parkinson’s disease severity assessment\",\"authors\":\"Qiushuo Cheng ,&nbsp;Catherine Morgan ,&nbsp;Arindam Sikdar ,&nbsp;Alessandro Masullo ,&nbsp;Alan Whone ,&nbsp;Majid Mirmehdi\",\"doi\":\"10.1016/j.artmed.2025.103194\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>People with Parkinson’s Disease (PD) often experience progressively worsening gait, including changes in how they turn around, as the disease progresses. Existing clinical rating tools are not capable of capturing hour-by-hour variations of PD symptoms, as they are confined to brief assessments within clinic settings, leaving gait performance outside these controlled environments unaccounted for. 
Measuring turning angles continuously and passively is a component step towards using gait characteristics as sensitive indicators of disease progression in PD. This paper presents a deep learning-based approach to automatically quantify turning angles by extracting 3D skeletons from videos and calculating the rotation of hip and knee joints. We utilise advanced human pose estimation models, Fastpose and Strided Transformer, on a total of 1386 turning video clips from 24 subjects (12 people with PD and 12 healthy control volunteers), trimmed from a PD dataset of unscripted free-living videos in a home-like setting (Turn-REMAP). We also curate a turning video dataset, Turn-H3.6M, from the public Human3.6M human pose benchmark with 3D groundtruth, to further validate our method. Previous gait research has primarily taken place in clinics or laboratories evaluating scripted gait outcomes, but this work focuses on free-living home settings where complexities exist, such as baggy clothing and poor lighting. Due to difficulties in obtaining accurate groundtruth data in a free-living setting, we quantise the angle into the nearest bin 45° based on the manual labelling of expert clinicians. Our method achieves a turning calculation accuracy of 41.6%, a Mean Absolute Error (MAE) of 34.7°, and a weighted precision (WPrec) of 68.3% for Turn-REMAP. On Turn-H3.6M, it achieves an accuracy of 73.5%, an MAE of 18.5°, and a WPrec of 86.2%. This is the first work to explore the use of single monocular camera data to quantify turns by PD patients in a home setting. All data and models are publicly available, providing a baseline for turning parameter measurement to promote future PD gait research.</div></div>\",\"PeriodicalId\":55458,\"journal\":{\"name\":\"Artificial Intelligence in Medicine\",\"volume\":\"167 \",\"pages\":\"Article 103194\"},\"PeriodicalIF\":6.2000,\"publicationDate\":\"2025-06-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Artificial Intelligence in Medicine\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0933365725001290\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial Intelligence in Medicine","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0933365725001290","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

People with Parkinson’s Disease (PD) often experience progressively worsening gait as the disease progresses, including changes in how they turn around. Existing clinical rating tools cannot capture hour-by-hour variations of PD symptoms, as they are confined to brief assessments within clinic settings, leaving gait performance outside these controlled environments unaccounted for. Measuring turning angles continuously and passively is a component step towards using gait characteristics as sensitive indicators of disease progression in PD. This paper presents a deep learning-based approach to automatically quantify turning angles by extracting 3D skeletons from videos and calculating the rotation of hip and knee joints. We utilise advanced human pose estimation models, Fastpose and Strided Transformer, on a total of 1386 turning video clips from 24 subjects (12 people with PD and 12 healthy control volunteers), trimmed from a PD dataset of unscripted free-living videos in a home-like setting (Turn-REMAP). We also curate a turning video dataset, Turn-H3.6M, from the public Human3.6M human pose benchmark with 3D groundtruth, to further validate our method. Previous gait research has primarily taken place in clinics or laboratories evaluating scripted gait outcomes, but this work focuses on free-living home settings, where complexities such as baggy clothing and poor lighting exist. Due to difficulties in obtaining accurate groundtruth data in a free-living setting, we quantise the angle into the nearest 45° bin based on the manual labelling of expert clinicians. Our method achieves a turning calculation accuracy of 41.6%, a Mean Absolute Error (MAE) of 34.7°, and a weighted precision (WPrec) of 68.3% on Turn-REMAP. On Turn-H3.6M, it achieves an accuracy of 73.5%, an MAE of 18.5°, and a WPrec of 86.2%. This is the first work to explore the use of data from a single monocular camera to quantify turns by people with PD in a home setting. All data and models are publicly available, providing a baseline for turning parameter measurement to promote future PD gait research.
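The abstract describes the pipeline only at a high level (3D skeletons from video, rotation of the hip joints, 45° label bins), so the following is a minimal, hypothetical Python sketch of one plausible reading of those steps, not the authors' released implementation. The hip joint indices, the vertical-axis convention, and the `evaluate` helper are all assumptions introduced for illustration.

```python
import numpy as np

# Hip joint indices assumed to follow the common 17-joint Human3.6M skeleton
# layout (1 = right hip, 4 = left hip) -- an assumption; the paper does not
# state its joint indexing here.
R_HIP, L_HIP = 1, 4

def pelvis_heading(pose: np.ndarray) -> float:
    """Heading (radians) of the pelvis in the ground plane for one frame.

    pose: (J, 3) array of 3D joint positions; y is assumed to be vertical.
    The facing direction is taken perpendicular to the left-right hip axis.
    """
    axis = pose[R_HIP] - pose[L_HIP]         # vector across the pelvis
    facing = np.array([-axis[2], axis[0]])   # hip axis rotated 90 deg in x-z plane
    return np.arctan2(facing[1], facing[0])

def turning_angle(poses: np.ndarray) -> float:
    """Unsigned net turning angle (degrees) over a clip.

    poses: (T, J, 3) array of per-frame 3D skeletons from a pose estimator.
    """
    headings = np.unwrap([pelvis_heading(p) for p in poses])  # avoid +/-pi jumps
    return float(abs(np.degrees(headings[-1] - headings[0])))

def quantise(angle_deg: float, bin_size: float = 45.0) -> float:
    """Snap an angle to the nearest bin, mirroring the paper's 45-degree labels."""
    return bin_size * round(angle_deg / bin_size)

def evaluate(pred_angles, label_bins):
    """Bin accuracy and MAE in the style of the reported metrics (illustrative)."""
    preds = np.array([quantise(a) for a in pred_angles])
    labels = np.asarray(label_bins, dtype=float)
    accuracy = float(np.mean(preds == labels))
    mae = float(np.mean(np.abs(np.asarray(pred_angles, dtype=float) - labels)))
    return accuracy, mae
```

Under the same reading, the reported weighted precision (WPrec) would be per-class precision weighted by class support over the binned predictions, e.g. `sklearn.metrics.precision_score(labels, preds, average="weighted")`.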
Source journal
Artificial Intelligence in Medicine (CAS category: Engineering Technology, Biomedical Engineering)
CiteScore: 15.00
Self-citation rate: 2.70%
Articles per year: 143
Review time: 6.3 months
Journal description: Artificial Intelligence in Medicine publishes original articles from a wide variety of interdisciplinary perspectives concerning the theory and practice of artificial intelligence (AI) in medicine, medically-oriented human biology, and health care. Artificial intelligence in medicine may be characterized as the scientific discipline pertaining to research studies, projects, and applications that aim at supporting decision-based medical tasks through knowledge- and/or data-intensive computer-based solutions that ultimately support and improve the performance of a human care provider.