Unlocking Gait Analysis Beyond the Gait Lab: High-Fidelity Replication of Knee Kinematics Using Inertial Motion Units and a Convolutional Neural Network

IF 2.1 Q3 ORTHOPEDICS
Stefano A. Bini MD, Nicholas Gillian PhD, Thomas A. Peterson PhD, Richard B. Souza PhD, PT, Brooke Schultz MS, ACE-CPT, Wojciech Mormul MS, Marek K. Cichoń MS, Agnieszka Barbara Szczotka MS, Ivan Poupyrev PhD
{"title":"解锁步态分析超越步态实验室:高保真膝关节运动学复制使用惯性运动单元和卷积神经网络","authors":"Stefano A. Bini MD ,&nbsp;Nicholas Gillian PhD ,&nbsp;Thomas A. Peterson PhD ,&nbsp;Richard B. Souza PhD, PT ,&nbsp;Brooke Schultz MS, ACE-CPT ,&nbsp;Wojciech Mormul MS ,&nbsp;Marek K. Cichoń MS ,&nbsp;Agnieszka Barbara Szczotka MS ,&nbsp;Ivan Poupyrev PhD","doi":"10.1016/j.artd.2025.101656","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Gait analysis using three-dimensional motion capture systems (3D motion capture) provides a combination of kinematic and kinetic measurements for quantifying and characterizing the motion and loads, respectively, of lower extremity joints during human movement. However, their high cost and limited accessibility impact their utility. Wearable inertial motion sensors offer a cost-effective alternative to measure simple temporospatial variables, but more complex kinematic variables require machine learning interfaces. We hypothesize that kinematic measures about the knee collected using motion capture can be replicated by coupling raw data collected from inertial measurement units (IMUs) to machine learning algorithms.</div></div><div><h3>Methods</h3><div>Data from 40 healthy participants performing fixed walking, stair climbing, and sit-to-stand tasks were collected using both 3D motion capture and IMUs. Sequence to sequence convolutional neural networks were trained to map IMU data to three motion capture kinematic outputs: right knee angle, right knee angular velocity, and right hip angle. Model performance was assessed using mean absolute error.</div></div><div><h3>Results</h3><div>The convolutional neural network models exhibited high accuracy in replicating motion capture-derived kinematic variables. Mean absolute error values for right knee angle ranged from 4.30 ± 1.55 to 5.79 ± 2.93 degrees, for right knee angular velocity from 7.82 ± 3.01 to 22.16 ± 9.52 degrees per second, and for right hip angle from 4.82 ± 2.29 to 8.63 ± 4.73 degrees. Task-specific variations in accuracy were observed.</div></div><div><h3>Conclusions</h3><div>The findings highlight the potential of leveraging raw data from wearable inertial sensors and machine learning algorithms to reproduce gait lab-quality kinematic data outside the laboratory settings for the study of knee function following joint injury, surgery, or the progression of joint disease.</div></div>","PeriodicalId":37940,"journal":{"name":"Arthroplasty Today","volume":"33 ","pages":"Article 101656"},"PeriodicalIF":2.1000,"publicationDate":"2025-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Unlocking Gait Analysis Beyond the Gait Lab: High-Fidelity Replication of Knee Kinematics Using Inertial Motion Units and a Convolutional Neural Network\",\"authors\":\"Stefano A. Bini MD ,&nbsp;Nicholas Gillian PhD ,&nbsp;Thomas A. Peterson PhD ,&nbsp;Richard B. Souza PhD, PT ,&nbsp;Brooke Schultz MS, ACE-CPT ,&nbsp;Wojciech Mormul MS ,&nbsp;Marek K. Cichoń MS ,&nbsp;Agnieszka Barbara Szczotka MS ,&nbsp;Ivan Poupyrev PhD\",\"doi\":\"10.1016/j.artd.2025.101656\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Background</h3><div>Gait analysis using three-dimensional motion capture systems (3D motion capture) provides a combination of kinematic and kinetic measurements for quantifying and characterizing the motion and loads, respectively, of lower extremity joints during human movement. 
However, their high cost and limited accessibility impact their utility. Wearable inertial motion sensors offer a cost-effective alternative to measure simple temporospatial variables, but more complex kinematic variables require machine learning interfaces. We hypothesize that kinematic measures about the knee collected using motion capture can be replicated by coupling raw data collected from inertial measurement units (IMUs) to machine learning algorithms.</div></div><div><h3>Methods</h3><div>Data from 40 healthy participants performing fixed walking, stair climbing, and sit-to-stand tasks were collected using both 3D motion capture and IMUs. Sequence to sequence convolutional neural networks were trained to map IMU data to three motion capture kinematic outputs: right knee angle, right knee angular velocity, and right hip angle. Model performance was assessed using mean absolute error.</div></div><div><h3>Results</h3><div>The convolutional neural network models exhibited high accuracy in replicating motion capture-derived kinematic variables. Mean absolute error values for right knee angle ranged from 4.30 ± 1.55 to 5.79 ± 2.93 degrees, for right knee angular velocity from 7.82 ± 3.01 to 22.16 ± 9.52 degrees per second, and for right hip angle from 4.82 ± 2.29 to 8.63 ± 4.73 degrees. Task-specific variations in accuracy were observed.</div></div><div><h3>Conclusions</h3><div>The findings highlight the potential of leveraging raw data from wearable inertial sensors and machine learning algorithms to reproduce gait lab-quality kinematic data outside the laboratory settings for the study of knee function following joint injury, surgery, or the progression of joint disease.</div></div>\",\"PeriodicalId\":37940,\"journal\":{\"name\":\"Arthroplasty Today\",\"volume\":\"33 \",\"pages\":\"Article 101656\"},\"PeriodicalIF\":2.1000,\"publicationDate\":\"2025-04-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Arthroplasty Today\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2352344125000433\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ORTHOPEDICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Arthroplasty Today","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352344125000433","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ORTHOPEDICS","Score":null,"Total":0}
Citations: 0


Background

Gait analysis using three-dimensional motion capture systems (3D motion capture) provides a combination of kinematic and kinetic measurements for quantifying and characterizing, respectively, the motion and loads of lower extremity joints during human movement. However, the high cost and limited accessibility of these systems restrict their utility. Wearable inertial motion sensors offer a cost-effective alternative for measuring simple temporospatial variables, but more complex kinematic variables require machine learning interfaces. We hypothesize that kinematic measures of the knee collected using motion capture can be replicated by coupling raw data collected from inertial measurement units (IMUs) to machine learning algorithms.

Methods

Data from 40 healthy participants performing fixed walking, stair climbing, and sit-to-stand tasks were collected using both 3D motion capture and IMUs. Sequence-to-sequence convolutional neural networks were trained to map IMU data to three motion capture kinematic outputs: right knee angle, right knee angular velocity, and right hip angle. Model performance was assessed using mean absolute error.
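The abstract does not specify the network architecture, layer sizes, or IMU channel counts, so the following is only a minimal sketch of what a sequence-to-sequence 1D convolutional mapping from raw IMU channels to per-time-step kinematic outputs could look like. The PyTorch framework, the assumed 12 input channels (e.g., two 6-axis IMUs), the hidden width, and the training loss are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' model): a sequence-to-sequence 1D CNN
# mapping raw IMU channels to three kinematic signals per time step.
# Assumed shapes: input (batch, 12, T) for e.g. two 6-axis IMUs,
# output (batch, 3, T) for knee angle, knee angular velocity, hip angle.
import torch
import torch.nn as nn

class Seq2SeqKinematicsCNN(nn.Module):
    def __init__(self, in_channels: int = 12, out_channels: int = 3, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            # Same-length padding keeps the output as long as the input,
            # so every IMU time step gets a kinematic estimate.
            nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, out_channels, kernel_size=1),  # per-step regression head
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Example: a hypothetical 2-second window of IMU data sampled at 100 Hz.
model = Seq2SeqKinematicsCNN()
imu_window = torch.randn(8, 12, 200)                          # (batch, channels, time)
kinematics = model(imu_window)                                # (8, 3, 200)
loss = nn.L1Loss()(kinematics, torch.randn_like(kinematics))  # MAE-style training loss
```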

Results

The convolutional neural network models exhibited high accuracy in replicating motion capture-derived kinematic variables. Mean absolute error values for right knee angle ranged from 4.30 ± 1.55 to 5.79 ± 2.93 degrees, for right knee angular velocity from 7.82 ± 3.01 to 22.16 ± 9.52 degrees per second, and for right hip angle from 4.82 ± 2.29 to 8.63 ± 4.73 degrees. Task-specific variations in accuracy were observed.
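For reference, mean absolute error is simply the average absolute difference between a predicted kinematic curve and the motion-capture reference, and the ranges above are mean ± standard deviation across tasks or participants. The sketch below shows one way such values could be computed; the per-participant numbers in it are hypothetical placeholders, not data from the study.

```python
# Illustrative only: mean absolute error (MAE) between a predicted and a
# reference (motion-capture) kinematic curve, then aggregated as mean ± SD,
# matching how the results above are reported.
import numpy as np

def mae(pred: np.ndarray, ref: np.ndarray) -> float:
    """Average absolute difference between two equal-length signals."""
    return float(np.mean(np.abs(pred - ref)))

# Hypothetical per-participant errors for one task and variable (degrees).
per_participant_mae = np.array([4.1, 5.0, 3.8, 4.6, 4.9])
print(f"{per_participant_mae.mean():.2f} ± {per_participant_mae.std(ddof=1):.2f} degrees")
```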

Conclusions

The findings highlight the potential of leveraging raw data from wearable inertial sensors and machine learning algorithms to reproduce gait-lab-quality kinematic data outside laboratory settings for studying knee function after joint injury or surgery, or during the progression of joint disease.
Source journal

Arthroplasty Today (Medicine - Surgery)
CiteScore: 2.90
Self-citation rate: 0.00%
Articles published: 258
Review time: 40 weeks
Journal description

Arthroplasty Today is a companion journal to the Journal of Arthroplasty. The journal Arthroplasty Today brings together the clinical and scientific foundations for joint replacement of the hip and knee in an open-access, online format. Arthroplasty Today solicits manuscripts of the highest quality from all areas of scientific endeavor that relate to joint replacement or the treatment of its complications, including those dealing with patient outcomes, economic and policy issues, prosthetic design, biomechanics, biomaterials, and biologic response to arthroplasty. The journal focuses on case reports. It is the purpose of Arthroplasty Today to present material to practicing orthopaedic surgeons that will keep them abreast of developments in the field, prove useful in the care of patients, and aid in understanding the scientific foundation of this subspecialty area of joint replacement. The international members of the Editorial Board provide a worldwide perspective for the journal's area of interest. Their participation ensures that each issue of Arthroplasty Today provides the reader with timely, peer-reviewed articles of the highest quality.