Denoising of Spatiotemporal Gait Waveforms from Motion-Sensing Depth Camera using Least Mean Square (LMS) Adaptive Filter

Karl Vincent G. Castillo, N. Mendoza, Chelsea Andrea S. Morales, Allen Dominic A. Perez, Jansen Yna L. Unisa, A. Cruz

2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), November 2018. DOI: 10.1109/HNICEM.2018.8666294
Citations: 2
Abstract
Quantitative gait assessment is made possible by instruments such as 3D motion capture systems (Mo-Cap) and their cheaper counterpart, the RGB-D camera. The cost-effectiveness of the RGB-D camera makes it the more practical instrument for most clinical settings, but it is not as accurate as a Mo-Cap system. This paper presents the denoising of gait waveforms, obtained from the most common parameters produced by an RGB-D camera, using an LMS adaptive filter. The data came from 14 study volunteers whose normal walking gaits were recorded simultaneously with VICON cameras and Microsoft Kinect v2 sensors. Spatiotemporal gait parameters were calculated from the two sets of gait waveforms. The adaptive filter was trained on a training dataset to create a filter model that was then used in the testing phase. The two resulting data sets, unfiltered and filtered gait parameters, were compared against the motion capture system's gait parameters using statistical tools. Unfiltered parameters from the RGB-D camera exhibited a significant difference from the Mo-Cap parameters, with an average percent error of 26.83%. Based on statistics such as the mean, standard deviation, and root mean square error (RMSE), the findings confirmed that the filtered parameter data improved by an average of 11.34%, and the number of parameters rejected by the t-test was reduced by six.
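The abstract does not give the filter's implementation details (tap count, step size, or training scheme), but the standard LMS update it refers to is well known: the filter output is the dot product of the tap weights with recent noisy samples, the error is the difference from the reference (here, the Mo-Cap-like signal), and the weights move along the error gradient, w ← w + μ·e·x. Below is a minimal illustrative sketch in Python/NumPy; the signals, tap count `n_taps=8`, and step size `mu=0.01` are assumptions for demonstration, not values from the paper.

```python
import numpy as np

def lms_filter(noisy, reference, n_taps=8, mu=0.01):
    """Standard LMS adaptive filter.

    Adapts FIR tap weights so that filtering `noisy` tracks
    `reference`. Returns the filtered output, the error signal,
    and the final weights.
    """
    w = np.zeros(n_taps)              # filter tap weights
    out = np.zeros(len(noisy))        # filtered output
    err = np.zeros(len(noisy))        # instantaneous error
    for n in range(n_taps - 1, len(noisy)):
        x = noisy[n - n_taps + 1 : n + 1][::-1]  # newest sample first
        y = np.dot(w, x)              # filter output at sample n
        e = reference[n] - y          # error vs. reference signal
        w += mu * e * x               # LMS weight update
        out[n] = y
        err[n] = e
    return out, err, w

# Toy usage: a clean low-frequency sinusoid stands in for a
# Mo-Cap gait waveform, and an additive-noise version stands in
# for the Kinect waveform (both are synthetic stand-ins).
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 500)
clean = np.sin(t)
noisy = clean + 0.3 * rng.standard_normal(len(t))
filtered, e, w = lms_filter(noisy, clean)
```

After the weights converge, the filtered output should sit closer to the clean reference than the raw noisy input does, which mirrors the paper's train-then-test use of the filter model on new gait waveforms.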