Motion Prediction Based on sEMG-Transformer for Lower Limb Exoskeleton Robot Control

Min Zeng, J. Gu, Ying Feng
2023 International Conference on Advanced Robotics and Mechatronics (ICARM), July 8, 2023
DOI: 10.1109/ICARM58088.2023.10218920

Abstract

While lower limb exoskeleton robots can provide assisted walking by extracting the user's motion intention, it is difficult to obtain that intention effectively and convert it into information the robot can use. In this paper, a novel model based on sEMG-Transformer is proposed for continuous motion prediction of multiple lower-limb joint angles and is applied to the developed lower limb exoskeleton robot. The sEMG-Transformer model extracts time-series features from the sEMG sequences and establishes the mapping between the input data and the multi-joint angles. The predicted joint angles are then fed to the developed lower limb exoskeleton robot. Experimental studies are performed with able-bodied human wearers and compared against existing methods, including convolutional neural network (CNN), back-propagation (BP), and long short-term memory (LSTM) networks. Motion intention estimation based on the sEMG-Transformer network achieves better estimation performance and effectively enables users to walk synchronously with the lower limb exoskeleton robot.
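The abstract does not give architectural details, so the following is only an illustrative sketch of the general pipeline it describes: a window of multi-channel sEMG samples passes through a Transformer-style self-attention encoder, and a regression head maps the pooled features to a vector of joint angles. All hyperparameters (channel count, window length, model width, number of joints) and the single-head, single-layer design are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TinySEMGTransformer:
    """Toy single-head, single-layer Transformer encoder mapping a window
    of multi-channel sEMG samples to a vector of joint angles.
    Weights are random here; a real model would be trained on paired
    sEMG windows and measured joint angles."""
    def __init__(self, n_channels=8, d_model=16, n_joints=4, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(d_model)
        self.W_in = rng.normal(0, s, (n_channels, d_model))  # channel embedding
        self.W_q = rng.normal(0, s, (d_model, d_model))      # query projection
        self.W_k = rng.normal(0, s, (d_model, d_model))      # key projection
        self.W_v = rng.normal(0, s, (d_model, d_model))      # value projection
        self.W_out = rng.normal(0, s, (d_model, n_joints))   # regression head
        self.d_model = d_model

    def forward(self, x):
        # x: (T, n_channels) window of sEMG samples
        T = x.shape[0]
        h = x @ self.W_in
        # sinusoidal positional encoding so attention sees sample order
        pos = np.arange(T)[:, None]
        i = np.arange(self.d_model)[None, :]
        angle = pos / (10000 ** (2 * (i // 2) / self.d_model))
        h = h + np.where(i % 2 == 0, np.sin(angle), np.cos(angle))
        # scaled dot-product self-attention over the time axis
        q, k, v = h @ self.W_q, h @ self.W_k, h @ self.W_v
        attn = softmax(q @ k.T / np.sqrt(self.d_model))
        h = attn @ v
        # mean-pool over time, then linear head -> predicted joint angles
        return h.mean(axis=0) @ self.W_out

model = TinySEMGTransformer()
window = np.random.default_rng(1).normal(size=(50, 8))  # 50 samples, 8 channels
angles = model.forward(window)                          # shape (4,): one angle per joint
```

In deployment, the predicted angle vector would serve as the reference trajectory for the exoskeleton's joint controllers, as the abstract describes.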