Agreement Between Sagittal Foot and Tibia Angles During Running Derived From an Open-Source Markerless Motion Capture Platform and Manual Digitization

IF 1.1 | CAS Tier 4 (Medicine) | JCR Q4: Engineering, Biomedical
C. Johnson, J. Outerleys, I. Davis
{"title":"跑步时矢状足和胫骨角度的一致性来源于一个开源的无标记运动捕捉平台和手动数字化。","authors":"C. Johnson, J. Outerleys, I. Davis","doi":"10.1123/jab.2021-0323","DOIUrl":null,"url":null,"abstract":"Several open-source platforms for markerless motion capture offer the ability to track 2-dimensional (2D) kinematics using simple digital video cameras. We sought to establish the performance of one of these platforms, DeepLabCut. Eighty-four runners who had sagittal plane videos recorded of their left lower leg were included in the study. Data from 50 participants were used to train a deep neural network for 2D pose estimation of the foot and tibia segments. The trained model was used to process novel videos from 34 participants for continuous 2D coordinate data. Overall network accuracy was assessed using the train/test errors. Foot and tibia angles were calculated for 7 strides using manual digitization and markerless methods. Agreement was assessed with mean absolute differences and intraclass correlation coefficients. Bland-Altman plots and paired t tests were used to assess systematic bias. The train/test errors for the trained network were 2.87/7.79 pixels, respectively (0.5/1.2 cm). Compared to manual digitization, the markerless method was found to systematically overestimate foot angles and underestimate tibial angles (P < .01, d = 0.06-0.26). However, excellent agreement was found between the segment calculation methods, with mean differences ≤1° and intraclass correlation coefficients ≥.90. Overall, these results demonstrate that open-source, markerless methods are a promising new tool for analyzing human motion.","PeriodicalId":54883,"journal":{"name":"Journal of Applied Biomechanics","volume":"1 1","pages":"1-6"},"PeriodicalIF":1.1000,"publicationDate":"2022-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Agreement Between Sagittal Foot and Tibia Angles During Running Derived From an Open-Source Markerless Motion Capture Platform and Manual Digitization.\",\"authors\":\"C. Johnson, J. Outerleys, I. Davis\",\"doi\":\"10.1123/jab.2021-0323\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Several open-source platforms for markerless motion capture offer the ability to track 2-dimensional (2D) kinematics using simple digital video cameras. We sought to establish the performance of one of these platforms, DeepLabCut. Eighty-four runners who had sagittal plane videos recorded of their left lower leg were included in the study. Data from 50 participants were used to train a deep neural network for 2D pose estimation of the foot and tibia segments. The trained model was used to process novel videos from 34 participants for continuous 2D coordinate data. Overall network accuracy was assessed using the train/test errors. Foot and tibia angles were calculated for 7 strides using manual digitization and markerless methods. Agreement was assessed with mean absolute differences and intraclass correlation coefficients. Bland-Altman plots and paired t tests were used to assess systematic bias. The train/test errors for the trained network were 2.87/7.79 pixels, respectively (0.5/1.2 cm). Compared to manual digitization, the markerless method was found to systematically overestimate foot angles and underestimate tibial angles (P < .01, d = 0.06-0.26). However, excellent agreement was found between the segment calculation methods, with mean differences ≤1° and intraclass correlation coefficients ≥.90. 
Overall, these results demonstrate that open-source, markerless methods are a promising new tool for analyzing human motion.\",\"PeriodicalId\":54883,\"journal\":{\"name\":\"Journal of Applied Biomechanics\",\"volume\":\"1 1\",\"pages\":\"1-6\"},\"PeriodicalIF\":1.1000,\"publicationDate\":\"2022-03-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Applied Biomechanics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1123/jab.2021-0323\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Applied Biomechanics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1123/jab.2021-0323","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 3

Abstract

Several open-source platforms for markerless motion capture offer the ability to track 2-dimensional (2D) kinematics using simple digital video cameras. We sought to establish the performance of one of these platforms, DeepLabCut. Eighty-four runners who had sagittal plane videos recorded of their left lower leg were included in the study. Data from 50 participants were used to train a deep neural network for 2D pose estimation of the foot and tibia segments. The trained model was used to process novel videos from 34 participants for continuous 2D coordinate data. Overall network accuracy was assessed using the train/test errors. Foot and tibia angles were calculated for 7 strides using manual digitization and markerless methods. Agreement was assessed with mean absolute differences and intraclass correlation coefficients. Bland-Altman plots and paired t tests were used to assess systematic bias. The train/test errors for the trained network were 2.87/7.79 pixels, respectively (0.5/1.2 cm). Compared to manual digitization, the markerless method was found to systematically overestimate foot angles and underestimate tibial angles (P < .01, d = 0.06-0.26). However, excellent agreement was found between the segment calculation methods, with mean differences ≤1° and intraclass correlation coefficients ≥.90. Overall, these results demonstrate that open-source, markerless methods are a promising new tool for analyzing human motion.
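The pipeline the abstract describes maps onto DeepLabCut's documented high-level Python API. Below is a minimal sketch of that workflow under stated assumptions: the project name, video paths, landmark set, and training settings are illustrative, not the authors' actual configuration.

```python
# A minimal sketch of the DeepLabCut workflow described above: label foot and
# tibia landmarks, train a pose-estimation network on one subset of runners,
# and analyze held-out videos for continuous 2D coordinates. Names and paths
# are illustrative assumptions.
import deeplabcut

# Hypothetical sagittal-plane videos of the left lower leg (50 runners were
# used for training in the study).
train_videos = ["videos/runner_01.mp4", "videos/runner_02.mp4"]

config_path = deeplabcut.create_new_project(
    "sagittal-lower-leg", "lab", train_videos, copy_videos=True
)

# Extract frames, hand-label the foot/tibia landmarks (defined in config.yaml),
# assemble the training dataset, and train the network.
deeplabcut.extract_frames(config_path, mode="automatic", algo="kmeans")
deeplabcut.label_frames(config_path)  # opens the labeling GUI
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path, maxiters=200_000)

# Report train/test pixel errors (the study reports 2.87/7.79 pixels), then
# run inference on novel videos from the held-out participants.
deeplabcut.evaluate_network(config_path)
deeplabcut.analyze_videos(config_path, ["videos/runner_51.mp4"], save_as_csv=True)
```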
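For the agreement analysis, the sketch below derives sagittal segment angles from tracked 2D landmark pairs and computes the measures named in the abstract: mean absolute difference, an intraclass correlation (ICC(2,1) in the Shrout and Fleiss ANOVA formulation is assumed), a paired t test for systematic bias, and Bland-Altman bias with limits of agreement. The landmark pairing and per-stride values are hypothetical, included only so the sketch runs end to end.

```python
# Sketch of the agreement analysis: segment angles from 2D landmarks, then
# mean absolute difference, ICC(2,1), paired t test, and Bland-Altman bias.
import numpy as np
from scipy import stats

def segment_angle(proximal_xy: np.ndarray, distal_xy: np.ndarray) -> np.ndarray:
    """Sagittal-plane segment angle (degrees) of the proximal-to-distal
    vector, measured relative to the horizontal (x) axis."""
    d = proximal_xy - distal_xy
    return np.degrees(np.arctan2(d[:, 1], d[:, 0]))

def icc_2_1(x: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss). x has shape (subjects, methods)."""
    n, k = x.shape
    grand = x.mean()
    msr = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)  # between rows
    msc = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)  # between cols
    sse = np.sum((x - x.mean(axis=1, keepdims=True)
                    - x.mean(axis=0, keepdims=True) + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical per-stride foot angles from the two methods (7 strides).
manual = np.array([12.3, 11.8, 12.9, 12.1, 11.5, 12.6, 12.0])      # digitized
markerless = np.array([12.9, 12.4, 13.3, 12.8, 12.1, 13.1, 12.5])  # DeepLabCut

mad = np.mean(np.abs(markerless - manual))            # mean absolute difference
icc = icc_2_1(np.column_stack([manual, markerless]))  # agreement between methods
t, p = stats.ttest_rel(markerless, manual)            # systematic bias?
diff = markerless - manual
bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)      # Bland-Altman bias, LoA
print(f"MAD={mad:.2f} deg, ICC={icc:.2f}, t={t:.2f}, p={p:.3f}, "
      f"bias={bias:.2f} deg (LoA +/-{loa:.2f} deg)")
```

In the study this computation would run per participant across the 7 strides before aggregating; the pattern above (small mean difference but high ICC, with a significant paired t test) is exactly how a systematic yet small bias like the reported one would appear.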
Source journal
Journal of Applied Biomechanics (Medicine; Engineering, Biomedical)
CiteScore: 2.00
Self-citation rate: 0.00%
Annual articles: 47
Review time: 6-12 weeks
About the journal: The mission of the Journal of Applied Biomechanics (JAB) is to disseminate the highest quality peer-reviewed studies that utilize biomechanical strategies to advance the study of human movement. Areas of interest include clinical biomechanics, gait and posture mechanics, musculoskeletal and neuromuscular biomechanics, sport mechanics, and biomechanical modeling. Studies of sport performance that explicitly generalize to broader activities, contribute substantially to fundamental understanding of human motion, or are in a sport that enjoys wide participation, are welcome. Also within the scope of JAB are studies using biomechanical strategies to investigate the structure, control, function, and state (health and disease) of animals.