Hand Prosthesis: Finger Localization Based on Forearm Ultrasound Imaging
Amir Samadi, Mohammad-Reza Azizi, S. Kashef, M. Akbarzadeh-T., Alireza Akbarzadeh-T, A. Moradi
2019 7th International Conference on Robotics and Mechatronics (ICRoM), November 2019
DOI: 10.1109/ICRoM48714.2019.9071794 (https://doi.org/10.1109/ICRoM48714.2019.9071794)
Abstract
As the mechanical capabilities of prosthetic hands advance, developing novel control strategies becomes crucial. Although surface electromyography (sEMG) is a functional human-machine interface in various commercial prostheses, it has practical limitations such as a low signal-to-noise ratio. This paper focuses on forearm ultrasound imaging for recognizing individual finger movement. In contrast to previously published research, which is dedicated only to discriminating discrete hand gestures, we present a method to control hand prostheses via the angle of each finger. The FUMUS (Ferdowsi University UltraSound) dataset is produced by acquiring ultrasound images from a healthy male subject while he flexes and extends his fingers, and labeling each image with finger angles measured via a checkerboard attached to the fingers. Exploiting the feature-extraction ability of convolutional neural networks, we design an end-to-end system around each of four deep convolutional architectures, the Visual Geometry Group networks (VGG-16 and VGG-19) and MobileNet V1 and V2, and use 90% of our dataset to train the networks to predict the labels of new forearm ultrasound images. On the remaining 10% of the dataset, unseen during training, the networks achieve a Mean Absolute Error (MAE) of approximately 1 degree between the predicted and ground-truth finger angles.
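The abstract does not include implementation details, but the pipeline it describes, a pretrained CNN fine-tuned end-to-end to regress per-finger angles and evaluated with MAE, can be illustrated with a minimal sketch. The following PyTorch snippet swaps the VGG-16 classification head for a regression head; the number of output angles, the L1 loss, and the hyperparameters are assumptions for illustration, not the authors' reported values.

```python
# Hypothetical sketch (not the authors' code): fine-tuning a pretrained
# VGG-16 to regress finger flexion angles from forearm ultrasound frames.
# NUM_FINGERS, the loss choice, and the learning rate are assumptions.
import torch
import torch.nn as nn
from torchvision import models

NUM_FINGERS = 5  # assumption: one regressed angle per finger

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
# Replace the 1000-class ImageNet head with a regression head that
# outputs one angle per finger.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_FINGERS)

criterion = nn.L1Loss()  # L1 loss corresponds to the reported MAE metric
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, angles: torch.Tensor) -> float:
    """One optimization step on a batch of ultrasound frames.

    images: (B, 3, 224, 224) tensor; angles: (B, NUM_FINGERS) in degrees.
    """
    optimizer.zero_grad()
    pred = model(images)          # predicted finger angles
    loss = criterion(pred, angles)  # mean absolute error in degrees
    loss.backward()
    optimizer.step()
    return loss.item()
```

The same head replacement applies to the MobileNet variants (their final classifier layer is swapped analogously), so all four architectures can be trained and compared under the identical 90/10 train/test protocol the abstract describes.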