{"title":"Motion Prediction Based on sEMG- Transformer for Lower Limb Exoskeleton Robot Control","authors":"Min Zeng, J. Gu, Ying Feng","doi":"10.1109/ICARM58088.2023.10218920","DOIUrl":null,"url":null,"abstract":"While lower limb exoskeleton robots can realize assisted walking by extracting the user's motion intention, it is difficult to effectively obtain the motion intention of the human body and convert it into information for the robot. In this paper, a novel model based on sEMG-Transformer is proposed for continuous motion prediction of multiple joint angles of lower limbs and applied to the developed lower limb exoskeleton robot. The sEMG-Transformer model can be used to extract the time series features from the sEMG sequences and establish the mapping between input data and multi-joint angle. Then, the predicted multi-joint angles are inputted into the developed lower limb exoskeleton robot. Experimental studies are performed with able-bodied human wearers and compared to existing methods, such as convolutional neural network (CNN), back propagation (BP), and long short-term memory (LSTM) networks. The motion intention estimation based on the sEMG-Transformer network has better estimation performance, which can effectively enable users to walk synchronously with the lower limb exoskeleton robot.","PeriodicalId":220013,"journal":{"name":"2023 International Conference on Advanced Robotics and Mechatronics (ICARM)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Advanced Robotics and Mechatronics (ICARM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICARM58088.2023.10218920","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
While lower limb exoskeleton robots can provide assisted walking by extracting the user's motion intention, it remains difficult to obtain that intention from the human body effectively and convert it into control information for the robot. In this paper, a novel sEMG-Transformer model is proposed for continuous prediction of multiple lower-limb joint angles and applied to the developed lower limb exoskeleton robot. The sEMG-Transformer extracts time-series features from the sEMG sequences and establishes the mapping between the input data and the multi-joint angles. The predicted multi-joint angles are then fed to the developed lower limb exoskeleton robot. Experimental studies are performed with able-bodied wearers, and the method is compared with existing approaches such as convolutional neural network (CNN), back-propagation (BP), and long short-term memory (LSTM) networks. Motion intention estimation based on the sEMG-Transformer network achieves better estimation performance and effectively enables users to walk synchronously with the lower limb exoskeleton robot.
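
To make the described pipeline concrete, the sketch below shows one plausible way a Transformer encoder could map a sliding window of multi-channel sEMG samples to continuous multi-joint angle predictions. The paper's actual architecture, channel count, window length, and layer sizes are not given in the abstract, so all hyperparameters and names here (e.g. SEMGTransformerRegressor, n_channels=8, window_len=200) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming a standard Transformer-encoder regressor in PyTorch.
# Layer sizes, sEMG channel count, and window length are placeholder choices.
import torch
import torch.nn as nn


class SEMGTransformerRegressor(nn.Module):
    """Maps a window of multi-channel sEMG samples to lower-limb joint angles."""

    def __init__(self, n_channels=8, n_joints=4, d_model=64,
                 n_heads=4, n_layers=2, window_len=200):
        super().__init__()
        # Project raw sEMG channels into the Transformer embedding space.
        self.input_proj = nn.Linear(n_channels, d_model)
        # Learned positional encoding over the time window.
        self.pos_embed = nn.Parameter(torch.zeros(1, window_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Regression head: pooled temporal features -> joint angles.
        self.head = nn.Linear(d_model, n_joints)

    def forward(self, semg):
        # semg: (batch, window_len, n_channels), e.g. a 200-sample sliding window
        x = self.input_proj(semg) + self.pos_embed[:, :semg.size(1)]
        x = self.encoder(x)      # time-series feature extraction via self-attention
        x = x.mean(dim=1)        # average-pool over the time dimension
        return self.head(x)      # predicted multi-joint angles


if __name__ == "__main__":
    model = SEMGTransformerRegressor()
    batch = torch.randn(16, 200, 8)   # 16 windows of 8-channel sEMG
    angles = model(batch)             # -> (16, 4) joint-angle predictions
    print(angles.shape)
```

In a setup like this, the predicted joint-angle vector would be passed to the exoskeleton's joint controllers at each control step; comparison baselines such as CNN, BP, and LSTM regressors would share the same input windows and output targets.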