Ground Reaction Force and Moment Estimation through EMG Sensing Using Long Short-Term Memory Network during Posture Coordination.

IF 10.5 Q1 ENGINEERING, BIOMEDICAL
Sei-Ichi Sakamoto, Yonatan Hutabarat, Dai Owaki, Mitsuhiro Hayashibe
{"title":"Ground Reaction Force and Moment Estimation through EMG Sensing Using Long Short-Term Memory Network during Posture Coordination.","authors":"Sei-Ichi Sakamoto,&nbsp;Yonatan Hutabarat,&nbsp;Dai Owaki,&nbsp;Mitsuhiro Hayashibe","doi":"10.34133/cbsystems.0016","DOIUrl":null,"url":null,"abstract":"<p><p>Motion prediction based on kinematic information such as body segment displacement and joint angle has been widely studied. Because motions originate from forces, it is beneficial to estimate dynamic information, such as the ground reaction force (GRF), in addition to kinematic information for advanced motion prediction. In this study, we proposed a method to estimate GRF and ground reaction moment (GRM) from electromyography (EMG) in combination with and without an inertial measurement unit (IMU) sensor using a machine learning technique. A long short-term memory network, which is suitable for processing long time-span data, was constructed with EMG and IMU as input data to estimate GRF during posture control and stepping motion. The results demonstrate that the proposed method can provide the GRF estimation with a root mean square error (RMSE) of 8.22 ± 0.97% (mean ± SE) for the posture control motion and 11.17 ± 2.16% (mean ± SE) for the stepping motion. We could confirm that EMG input is essential especially when we need to predict both GRF and GRM with limited numbers of sensors attached under knees. In addition, we developed a GRF visualization system integrated with ongoing motion in a Unity environment. This system enabled the visualization of the GRF vector in 3-dimensional space and provides predictive motion direction based on the estimated GRF, which can be useful for human motion prediction with portable sensors.</p>","PeriodicalId":72764,"journal":{"name":"Cyborg and bionic systems (Washington, D.C.)","volume":"4 ","pages":"0016"},"PeriodicalIF":10.5000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10044327/pdf/","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cyborg and bionic systems (Washington, D.C.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.34133/cbsystems.0016","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 2

Abstract

Motion prediction based on kinematic information such as body segment displacement and joint angle has been widely studied. Because motions originate from forces, it is beneficial to estimate dynamic information, such as the ground reaction force (GRF), in addition to kinematic information for advanced motion prediction. In this study, we proposed a method to estimate the GRF and ground reaction moment (GRM) from electromyography (EMG), with and without an inertial measurement unit (IMU) sensor, using a machine learning technique. A long short-term memory (LSTM) network, which is suitable for processing long time-span data, was constructed with EMG and IMU signals as input to estimate the GRF during posture control and stepping motion. The results demonstrate that the proposed method provides GRF estimates with a root mean square error (RMSE) of 8.22 ± 0.97% (mean ± SE) for the posture control motion and 11.17 ± 2.16% (mean ± SE) for the stepping motion. We confirmed that the EMG input is essential, especially when both the GRF and GRM must be predicted with a limited number of sensors attached below the knees. In addition, we developed a GRF visualization system integrated with the ongoing motion in a Unity environment. This system enables visualization of the GRF vector in 3-dimensional space and provides a predicted motion direction based on the estimated GRF, which can be useful for human motion prediction with portable sensors.
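
As an informal illustration of the estimation pipeline summarized above, the sketch below shows one way such a mapping could be implemented: an LSTM regressor that takes windows of EMG and IMU samples as input and outputs GRF/GRM components, plus a normalized RMSE metric expressed in percent. This is a minimal PyTorch sketch under assumptions not stated in the abstract: the channel counts, window length, layer sizes, and the peak-to-peak normalization of the RMSE are illustrative choices, not the authors' actual configuration.

import torch
import torch.nn as nn

class GRFEstimator(nn.Module):
    """Sketch of an LSTM regressor mapping EMG/IMU windows to GRF/GRM values."""
    def __init__(self, n_channels=14, hidden_size=64, n_outputs=6):
        # n_channels: e.g., 8 EMG + 6 IMU channels (assumed);
        # n_outputs: e.g., 3 GRF + 3 GRM components (assumed).
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_size, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_size, n_outputs)

    def forward(self, x):
        # x: (batch, time, channels) window of preprocessed EMG/IMU signals
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # estimate at the last time step of the window

def rmse_percent(pred, target):
    # RMSE normalized by the target's peak-to-peak range, in %.
    # One common convention; the paper's exact normalization is not given in the abstract.
    rmse = torch.sqrt(torch.mean((pred - target) ** 2))
    return 100.0 * rmse / (target.max() - target.min())

# Example forward pass on dummy data (shapes are illustrative)
model = GRFEstimator()
window = torch.randn(32, 200, 14)   # 32 windows, 200 time samples, 14 channels
estimate = model(window)            # (32, 6) estimated GRF/GRM components
print(estimate.shape)

In the paper's setting, the input channels would correspond to the EMG electrodes and the below-knee IMU placements, and the outputs to the GRF and GRM components that are then visualized as a 3-dimensional vector in the Unity environment.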

Source journal: Cyborg and Bionic Systems
CiteScore: 7.70
Self-citation rate: 0.00%
Review time: 21 weeks