Dynamic Hand Gesture Recognition Based on A-Mode Ultrasound Sensing: Proposing an Algorithm Based on the Long Short-Term Memory Framework
Donghan Liu, Dinghuang Zhang, Gongyue Zhang, Honghai Liu
IEEE Systems, Man, and Cybernetics Magazine, October 2023. DOI: 10.1109/msmc.2023.3299431 (https://doi.org/10.1109/msmc.2023.3299431)
Abstract
Hand gesture recognition plays a crucial role in the field of human–computer interaction (HCI). In terms of the multimodal sensing of hand gestures, the A-mode ultrasound (AUS) signal is far less investigated, especially for dynamic hand gestures, than its counterparts, such as surface electromyography (sEMG). In this article, we explore the recognition of dynamic hand gestures by proposing an AUS-based deep learning algorithm that encodes temporal correlation within the long short-term memory (LSTM) framework. First, a dataset of dynamic handwritten digits (0 through 9) was created and recorded. Then, after preprocessing the data, we propose an algorithm based on this deep learning framework. We also designed two strategies with different network structures for comparison. Finally, through experiments, the accuracies of different deep learning structures [convolutional neural network (CNN) and LSTM] and of a traditional approach based on feature extraction with a support vector machine (SVM) are compared for dynamic gesture recognition from ultrasonic (US) signals, and we show that the LSTM achieves better performance. The experimental results show that the proposed method achieves 89.5% accuracy, outperforming its counterparts. This paves the way for potential HCI applications involving dynamic hand gestures. We anticipate that further applications of dynamic gesture recognition will be explored in future work to bring this research into real-life use.
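The abstract does not specify the network configuration, so the following is only a minimal sketch of the general idea: an LSTM classifier over a sequence of per-frame AUS feature vectors, with a 10-way output for the handwritten digits 0 through 9. The frame feature size, sequence length, hidden size, and all other hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): an LSTM over per-frame
# A-mode ultrasound (AUS) feature vectors, classifying one of 10 digit gestures.
# Shapes and hyperparameters are assumed for illustration.
import torch
import torch.nn as nn


class AusLstmClassifier(nn.Module):
    def __init__(self, frame_features=64, hidden_size=128, num_layers=2, num_classes=10):
        super().__init__()
        # batch_first=True -> inputs shaped (batch, time, features)
        self.lstm = nn.LSTM(frame_features, hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, frame_features), one feature vector per AUS frame
        outputs, _ = self.lstm(x)
        # Summarize the gesture with the hidden state at the last time step
        return self.head(outputs[:, -1, :])


if __name__ == "__main__":
    model = AusLstmClassifier()
    # Dummy batch: 8 gestures, 50 AUS frames each, 64 features per frame
    dummy = torch.randn(8, 50, 64)
    logits = model(dummy)          # shape (8, 10)
    print(logits.argmax(dim=1))    # predicted digit per gesture
```

Using the last time step's hidden state is one common way to summarize a gesture of varying speed; the paper's CNN-versus-LSTM comparison hinges on whether such temporal dependencies across frames are modeled explicitly.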