Title: Multimodal deep learning network based hand ADLs tasks classification for prosthetics control
Authors: L. Zhengyi, Zhou Hui, Yang Dandan, Xie Shui-qing
Published in: 2017 International Conference on Progress in Informatics and Computing (PIC), December 2017
DOI: 10.1109/PIC.2017.8359521
Citations: 5
Abstract
Natural control methods based on surface electromyography (sEMG) and pattern recognition are promising for hand prosthetics. However, the control robustness achieved in research to date is still not sufficient for many activities of daily living (ADLs). The difficulty stems from the limited sEMG signals available in clinical practice, which are susceptible to interference; hand movement data therefore need to be combined with sEMG to improve classification robustness. Human hand ADLs consist of complex sequences of finger joint movements, and capturing their temporal dynamics is fundamental to successful hand prosthetics control. Current research suggests that recurrent neural networks (RNNs) are well suited to automating feature extraction in time-series domains, and that dynamic movement primitives (DMPs) can represent hand kinematic primitives. We design a multimodal deep framework for inter-subject ADL recognition that: (i) fuses heterogeneous sensors; (ii) does not require hand-crafted features; and (iii) incorporates a dynamics model of hand ADL task control. We evaluate our framework on the Ninapro datasets. Results show that our framework outperforms competing single-modality deep networks and some previously reported results.
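To make the fusion idea concrete, here is a minimal sketch (not the authors' actual architecture) of late concatenation of an sEMG feature stream and a kinematic feature stream feeding a vanilla recurrent layer that outputs an ADL class. All dimensions, weight initialisations, and names are hypothetical placeholders for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 8 sEMG channels, 3 DMP-style kinematic features,
# 16 hidden units, 5 ADL classes, sequences of 20 time steps.
N_EMG, N_KIN, N_HID, N_CLS, T = 8, 3, 16, 5, 20

# Randomly initialised weights stand in for a trained model.
W_in = rng.normal(scale=0.1, size=(N_EMG + N_KIN, N_HID))  # fused input -> hidden
W_hh = rng.normal(scale=0.1, size=(N_HID, N_HID))          # hidden recurrence
W_out = rng.normal(scale=0.1, size=(N_HID, N_CLS))         # hidden -> class scores

def classify(emg_seq, kin_seq):
    """Run a vanilla RNN over concatenated sEMG + kinematic frames."""
    h = np.zeros(N_HID)
    for emg_t, kin_t in zip(emg_seq, kin_seq):
        x = np.concatenate([emg_t, kin_t])   # heterogeneous sensor fusion
        h = np.tanh(x @ W_in + h @ W_hh)     # recurrent state update
    logits = h @ W_out
    return int(np.argmax(logits))            # predicted ADL class index

# Dummy input sequences in place of real Ninapro recordings.
emg = rng.normal(size=(T, N_EMG))
kin = rng.normal(size=(T, N_KIN))
pred = classify(emg, kin)
print(pred)
```

In this early-fusion variant both modalities are concatenated per frame before the recurrence, so the hidden state captures their joint temporal dynamics; a late-fusion design would instead run one RNN per modality and merge their final states.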