{"title":"Estimating lower extremity joint angles during gait using reduced number of sensors count via deep learning","authors":"M. Hossain, Hwan Choi, Zhishan Guo","doi":"10.1117/12.2643786","DOIUrl":null,"url":null,"abstract":"Estimating lower extremity joint angle during gait is essential for biomechanical analysis and clinical purposes. Traditionally infrared light-based motion capture systems are used to get the joint angle information. However, such an approach is restricted to the lab environment, limiting the applicability of the method in daily living. Inertial Measurement Units (IMU) sensors can solve this limitation but are needed in each body segment, causing discomfort and impracticality in everyday living. As a result, it is desirable to build a system that can measure joint angles in daily living while ensuring user comfort. For this reason, this paper uses deep learning to estimate joint angle during gait using only two IMU sensors mounted on participants' shoes under four different walking conditions, i.e., treadmill, overground, stair, and slope. Specifically, we leverage Gated Recurrent Unit (GRU), 1D, and 2D convolutional layers to create sub-networks and take their average to get a final model in an end-to-end manner. Extensive evaluations are done on the proposed method, which outperforms the baseline and improves the Root Mean Square Error (RMSE) of joint angle prediction by up to 32.96%.","PeriodicalId":314555,"journal":{"name":"International Conference on Digital Image Processing","volume":"78 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Digital Image Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2643786","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Estimating lower extremity joint angles during gait is essential for biomechanical analysis and clinical purposes. Traditionally, infrared light-based motion capture systems are used to obtain joint angle information. However, such systems are restricted to the lab environment, which limits their applicability in daily living. Inertial Measurement Unit (IMU) sensors can remove this restriction, but a sensor is needed on each body segment, causing discomfort and making the setup impractical for everyday use. It is therefore desirable to build a system that can measure joint angles in daily living while ensuring user comfort. To this end, this paper uses deep learning to estimate joint angles during gait using only two IMU sensors mounted on participants' shoes, under four different walking conditions: treadmill, overground, stair, and slope walking. Specifically, we leverage Gated Recurrent Unit (GRU), 1D convolutional, and 2D convolutional layers to create sub-networks and average their outputs to obtain the final model, trained in an end-to-end manner. Extensive evaluations show that the proposed method outperforms the baseline, improving the Root Mean Square Error (RMSE) of joint angle prediction by up to 32.96%.
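To make the described architecture concrete, the sketch below shows one plausible way to build an ensemble of GRU, 1D-convolutional, and 2D-convolutional sub-networks whose joint-angle predictions are averaged, along with the RMSE metric used for evaluation. This is a minimal PyTorch illustration, not the authors' released code; all layer sizes, the window length, the channel count (assumed 2 shoe-mounted IMUs × 6 channels), and the number of predicted angles are illustrative assumptions.

```python
# Minimal sketch of a GRU / Conv1D / Conv2D ensemble for joint-angle regression.
# Assumptions (not from the paper): window length, channel count, layer sizes.
import torch
import torch.nn as nn

WINDOW = 100      # assumed samples per gait window
CHANNELS = 12     # assumed: 2 IMUs x (3-axis accel + 3-axis gyro)
N_ANGLES = 6      # assumed: hip, knee, ankle angles for both legs


class GRUBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.gru = nn.GRU(CHANNELS, 64, batch_first=True)
        self.head = nn.Linear(64, N_ANGLES)

    def forward(self, x):                    # x: (batch, WINDOW, CHANNELS)
        _, h = self.gru(x)                   # h: (1, batch, 64)
        return self.head(h[-1])              # (batch, N_ANGLES)


class Conv1DBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, N_ANGLES))

    def forward(self, x):                    # Conv1d expects (batch, C, T)
        return self.net(x.transpose(1, 2))


class Conv2DBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, N_ANGLES))

    def forward(self, x):                    # treat the window as a 1-channel image
        return self.net(x.unsqueeze(1))      # (batch, 1, WINDOW, CHANNELS)


class EnsembleJointAngleModel(nn.Module):
    """Average the three sub-network outputs; the whole model trains end to end."""
    def __init__(self):
        super().__init__()
        self.branches = nn.ModuleList([GRUBranch(), Conv1DBranch(), Conv2DBranch()])

    def forward(self, x):
        return torch.stack([b(x) for b in self.branches]).mean(dim=0)


def rmse(pred, target):
    """Root Mean Square Error, the evaluation metric named in the abstract."""
    return torch.sqrt(torch.mean((pred - target) ** 2))


if __name__ == "__main__":
    model = EnsembleJointAngleModel()
    imu_window = torch.randn(8, WINDOW, CHANNELS)   # dummy batch of IMU windows
    angles = torch.randn(8, N_ANGLES)               # dummy ground-truth angles
    print("RMSE on random data:", rmse(model(imu_window), angles).item())
```

Averaging branch outputs inside the model (rather than ensembling separately trained networks) keeps the system end-to-end trainable with a single loss, which matches the abstract's description; the specific branch designs here are only placeholders.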