Student Competition (Technology Innovation) ID 1972228

IF 2.4 Q1 REHABILITATION
Nicholas Zhao, J. Zariffa
{"title":"学生竞赛(科技创新) ID 1972228","authors":"Nicholas Zhao, J. Zariffa","doi":"10.46292/sci23-1972228s","DOIUrl":null,"url":null,"abstract":"Upper limb rehabilitation after cervical spinal cord injury is vital for regaining independence. Hand function assessments are critical for upper limb rehabilitation, but are unable to accurately reflect performance in the individual’s home environment. Video from wearable cameras (egocentric video), paired with deep learning, can assess hand function during activities of daily living (ADLs) at home. Previous studies have focused on analyzing quantitative aspects of hand usage, but there has yet to be a study that uses deep neural networks to assess the quality of hand usage from egocentric video. To train a deep neural network to predict hand function assessment scores from egocentric video. The dataset used contained egocentric videos of ADLs performed by 17 participants with AIS grades from A-D in a home simulation laboratory. Tasks were annotated with scores adapted from the Graded Redefined Assessment of Strength Sensibility and Prehension (GRASSP). The annotated video was then used to train and validate a SlowFast neural network to predict GRASSP scores, using leave-one-subject-out cross validation. Model performance was evaluated by mean absolute error, accuracy, and F1 score. The model was optimized with a hyperparameter sweep. The top performing model demonstrated a mean absolute error of 0.52±0.19, an accuracy of 0.55±0.14, and F1 score of 0.55±0.16, on an ordinal scale from 1 to 5. These results demonstrate that automated assessment of hand function is possible by applying deep learning to egocentric video. Future work should expand the model to larger datasets with more variability.","PeriodicalId":46769,"journal":{"name":"Topics in Spinal Cord Injury Rehabilitation","volume":null,"pages":null},"PeriodicalIF":2.4000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Student Competition (Technology Innovation) ID 1972228\",\"authors\":\"Nicholas Zhao, J. Zariffa\",\"doi\":\"10.46292/sci23-1972228s\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Upper limb rehabilitation after cervical spinal cord injury is vital for regaining independence. Hand function assessments are critical for upper limb rehabilitation, but are unable to accurately reflect performance in the individual’s home environment. Video from wearable cameras (egocentric video), paired with deep learning, can assess hand function during activities of daily living (ADLs) at home. Previous studies have focused on analyzing quantitative aspects of hand usage, but there has yet to be a study that uses deep neural networks to assess the quality of hand usage from egocentric video. To train a deep neural network to predict hand function assessment scores from egocentric video. The dataset used contained egocentric videos of ADLs performed by 17 participants with AIS grades from A-D in a home simulation laboratory. Tasks were annotated with scores adapted from the Graded Redefined Assessment of Strength Sensibility and Prehension (GRASSP). The annotated video was then used to train and validate a SlowFast neural network to predict GRASSP scores, using leave-one-subject-out cross validation. Model performance was evaluated by mean absolute error, accuracy, and F1 score. The model was optimized with a hyperparameter sweep. 
The top performing model demonstrated a mean absolute error of 0.52±0.19, an accuracy of 0.55±0.14, and F1 score of 0.55±0.16, on an ordinal scale from 1 to 5. These results demonstrate that automated assessment of hand function is possible by applying deep learning to egocentric video. Future work should expand the model to larger datasets with more variability.\",\"PeriodicalId\":46769,\"journal\":{\"name\":\"Topics in Spinal Cord Injury Rehabilitation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.4000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Topics in Spinal Cord Injury Rehabilitation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.46292/sci23-1972228s\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"REHABILITATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Topics in Spinal Cord Injury Rehabilitation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.46292/sci23-1972228s","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"REHABILITATION","Score":null,"Total":0}
Citations: 0

Abstract

Background: Upper limb rehabilitation after cervical spinal cord injury is vital for regaining independence. Hand function assessments are critical for upper limb rehabilitation but cannot accurately reflect performance in the individual's home environment. Video from wearable cameras (egocentric video), paired with deep learning, can assess hand function during activities of daily living (ADLs) at home. Previous studies have focused on quantitative aspects of hand usage, but no study has yet used deep neural networks to assess the quality of hand usage from egocentric video.

Objective: To train a deep neural network to predict hand function assessment scores from egocentric video.

Methods: The dataset contained egocentric videos of ADLs performed in a home simulation laboratory by 17 participants with AIS grades A through D. Tasks were annotated with scores adapted from the Graded Redefined Assessment of Strength, Sensibility and Prehension (GRASSP). The annotated video was then used to train and validate a SlowFast neural network to predict GRASSP scores, using leave-one-subject-out cross-validation. Model performance was evaluated by mean absolute error, accuracy, and F1 score, and the model was optimized with a hyperparameter sweep.

Results: The top-performing model achieved a mean absolute error of 0.52±0.19, an accuracy of 0.55±0.14, and an F1 score of 0.55±0.16 on an ordinal scale from 1 to 5.

Conclusion: These results demonstrate that automated assessment of hand function is possible by applying deep learning to egocentric video. Future work should expand the model to larger datasets with more variability.
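The methods above describe a complete pipeline: a SlowFast video network fine-tuned to predict five-level GRASSP-style scores, evaluated with leave-one-subject-out cross-validation by mean absolute error, accuracy, and F1. The sketch below illustrates one way such a setup could look; it is not the authors' code. The PyTorchVideo `slowfast_r50` backbone, the `make_loader` helper, and all training settings (optimizer, learning rate, epoch count) are assumptions for illustration only.

```python
"""Hedged sketch of SlowFast + leave-one-subject-out evaluation for
5-level GRASSP-style scoring. Dataset handling and hyperparameters are
illustrative assumptions, not the study's actual configuration."""
import numpy as np
import torch
from sklearn.metrics import accuracy_score, f1_score, mean_absolute_error
from sklearn.model_selection import LeaveOneGroupOut


def build_model(num_classes: int = 5) -> torch.nn.Module:
    # Pretrained SlowFast-R50 from PyTorchVideo; swap the 400-way Kinetics
    # classification head for one covering the ordinal GRASSP levels 1-5.
    model = torch.hub.load("facebookresearch/pytorchvideo",
                           "slowfast_r50", pretrained=True)
    head = model.blocks[-1].proj  # final Linear layer of the ResNet head
    model.blocks[-1].proj = torch.nn.Linear(head.in_features, num_classes)
    return model


def train_one_fold(model, loader, epochs=5, lr=1e-4, device="cuda"):
    # Plain cross-entropy training (an assumption); integer scores 1-5
    # are shifted to class ids 0-4.
    model.to(device).train()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for clips, scores in loader:
            # SlowFast expects a [slow_pathway, fast_pathway] pair of tensors.
            clips = [c.to(device) for c in clips]
            loss = torch.nn.functional.cross_entropy(
                model(clips), (scores - 1).to(device))
            opt.zero_grad()
            loss.backward()
            opt.step()


@torch.no_grad()
def predict_fold(model, loader, device="cuda"):
    model.eval()
    truth, preds = [], []
    for clips, scores in loader:
        clips = [c.to(device) for c in clips]
        preds.extend((model(clips).argmax(dim=1) + 1).cpu().tolist())
        truth.extend(scores.tolist())
    return np.array(truth), np.array(preds)


def evaluate_loso(make_loader, labels, subjects):
    # make_loader(indices) -> DataLoader over those clips; a hypothetical
    # helper standing in for whatever dataset class the study used.
    maes, accs, f1s = [], [], []
    splitter = LeaveOneGroupOut()  # one fold per participant (17 folds here)
    for tr, te in splitter.split(np.zeros(len(labels)), labels, groups=subjects):
        model = build_model()
        train_one_fold(model, make_loader(tr))
        y, yhat = predict_fold(model, make_loader(te))
        maes.append(mean_absolute_error(y, yhat))
        accs.append(accuracy_score(y, yhat))
        f1s.append(f1_score(y, yhat, average="macro"))
    # Mean ± SD across held-out subjects, matching how the abstract reports.
    return {name: (float(np.mean(v)), float(np.std(v)))
            for name, v in [("MAE", maes), ("accuracy", accs), ("F1", f1s)]}
```

Grouping the splits by participant rather than by clip is the point of leave-one-subject-out: the reported mean±SD then reflects generalization to an unseen person's hands and movement patterns rather than memorization of individuals seen during training.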
Source journal
Topics in Spinal Cord Injury Rehabilitation
CiteScore: 3.20
Self-citation rate: 3.40%
Articles published: 33
Journal description: Now in our 22nd year as the leading interdisciplinary journal of SCI rehabilitation techniques and care. TSCIR is peer-reviewed, practical, and features one key topic per issue. Published topics include: mobility, sexuality, genitourinary, functional assessment, skin care, psychosocial, high tetraplegia, physical activity, pediatric, FES, SCI/TBI, electronic medicine, orthotics, secondary conditions, research, aging, legal issues, women & SCI, pain, environmental effects, life care planning.