Discrete Gesture Recognition Using Multimodal PPG, IMU, and Single-Channel EMG Recorded at the Wrist
Ethan Eddy; Evan Campbell; Ulysse Côté-Allard; Scott Bateman; Erik Scheme
IEEE Sensors Letters, published 2024-08-21 (JCR Q3, Engineering, Electrical & Electronic; IF 2.2)
DOI: 10.1109/LSENS.2024.3447240
URL: https://ieeexplore.ieee.org/document/10643274/
Citations: 0
Abstract
Discrete hand-gesture recognition using sensors built into wrist-wearable devices could enable always-available input across a wide range of ubiquitous environments. For example, a user could flick their wrist to dismiss a phone call or tap their thumb and index finger together to make a selection in mixed reality. To move toward such applications, this work evaluates a new commercially available multimodal device (the BioPoint by SIFI Labs) for recognizing seven dynamic hand gestures. Three sensors were evaluated: a single channel of electromyography (EMG), a three-axis accelerometer (ACC), and photoplethysmography (PPG). Using a deep LSTM-based network, the relative performance of each sensor and of all possible sensor combinations was compared for gesture classification. The results show that the combination of all sensors led to the highest classification accuracy (>96%), significantly outperforming each individual sensor (p < 0.05). In addition, the fusion of all sensors significantly improved performance across days (p < 0.05) and was significantly more resilient when classifying gestures elicited in unseen limb positions (p < 0.05). These results highlight the complementary benefits of fusing EMG, ACC, and PPG signals as a viable path forward for the reliable recognition of discrete event-driven gestures using wrist-based wearables.
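The pipeline the abstract describes, fusing the three sensor streams and feeding them to an LSTM-based classifier, can be sketched at the input-fusion level. The following is a minimal, hypothetical illustration only: the window length, hidden size, and weight initialization are assumptions, not values from the paper, and the sensor windows are simulated placeholders for real BioPoint recordings.

```python
import numpy as np

# Hypothetical sketch: concatenate per-sample EMG (1 ch), ACC (3 ch), and
# PPG (1 ch) into a 5-channel sequence, run one LSTM layer, and map the
# final hidden state to 7 gesture classes. All sizes are illustrative.

rng = np.random.default_rng(0)
T, HIDDEN, CLASSES = 200, 16, 7   # window length, hidden units, gestures


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def lstm_forward(x, W, U, b):
    """Single-layer LSTM over x of shape (T, D); gates stacked [i, f, g, o]."""
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for t in range(x.shape[0]):
        z = W @ x[t] + U @ h + b          # pre-activations, shape (4H,)
        i = sigmoid(z[0:H])               # input gate
        f = sigmoid(z[H:2 * H])           # forget gate
        g = np.tanh(z[2 * H:3 * H])       # candidate cell state
        o = sigmoid(z[3 * H:4 * H])       # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
    return h                              # last hidden state summarizes the window


# Simulated sensor windows standing in for real BioPoint streams
emg = rng.standard_normal((T, 1))
acc = rng.standard_normal((T, 3))
ppg = rng.standard_normal((T, 1))
x = np.concatenate([emg, acc, ppg], axis=1)   # fused (T, 5) input

D = x.shape[1]
W = rng.standard_normal((4 * HIDDEN, D)) * 0.1
U = rng.standard_normal((4 * HIDDEN, HIDDEN)) * 0.1
b = np.zeros(4 * HIDDEN)

h_last = lstm_forward(x, W, U, b)
logits = (rng.standard_normal((CLASSES, HIDDEN)) * 0.1) @ h_last
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # softmax over the 7 gestures
```

In a trained system the weights would of course be learned (and the paper's network is described only as "deep LSTM-based"); the point of the sketch is the early fusion step, where the three modalities are aligned and concatenated channel-wise before entering the recurrent layer.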