Assistive Multimodal Wearable for Open Air Digit Recognition Using Machine Learning
John M. Rattray, Maxwell Ujhazy, Robert Stevens, Ralph Etienne-Cummings
2023 11th International IEEE/EMBS Conference on Neural Engineering (NER), April 24, 2023. DOI: 10.1109/NER52421.2023.10123870
To increase access to digital systems for populations suffering from upper limb motor impairment, we present an assistive wearable device that captures gestures performed in open air. These open-air gestures provide an interface for users who cannot exhibit the fine motor control required by standardized human-computer interfaces built around miniature button input, such as keyboards and keypads. By capturing wrist motion with an accelerometer and muscle activation signatures with surface electromyography, we improve classification accuracy compared to using either modality alone. Twelve features were extracted from the multimodal time series data in both the time and frequency domains and used as input to a collection of four machine learning models for classification: Fine Tree, K-Nearest Neighbor, Support Vector Machine, and Artificial Neural Network. One subject performed the task of writing single digits in free space; after post-processing and feature extraction, we achieved a classification accuracy of 96.2% for binary discrimination of the digits zero and one using a support vector machine model, and an accuracy of 71% when classifying all 10 digits using an artificial neural network. Our findings indicate the feasibility of a wearable multimodal human-computer interface to relieve the burden that conventional interfaces present to motor-impaired users.
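The sketch below illustrates, in broad strokes, the kind of pipeline the abstract describes: windowed accelerometer and surface-EMG signals are reduced to time- and frequency-domain features and fed to a support vector machine for binary digit discrimination. It is not the authors' code; the specific feature set, window length, sampling rate, and synthetic data are illustrative assumptions, not values reported in the paper.

import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

FS = 1000  # assumed sampling rate in Hz (not specified in the abstract)

def channel_features(window: np.ndarray) -> np.ndarray:
    """Time- and frequency-domain features for one single-channel window."""
    mav = np.mean(np.abs(window))                 # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))           # root mean square
    wl = np.sum(np.abs(np.diff(window)))          # waveform length
    zc = np.sum(np.diff(np.sign(window)) != 0)    # zero-crossing count
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    peak_f = freqs[np.argmax(spectrum)]           # dominant frequency
    mean_f = np.sum(freqs * spectrum) / np.sum(spectrum)  # spectral centroid
    return np.array([mav, rms, wl, zc, peak_f, mean_f])

def fused_features(accel: np.ndarray, emg: np.ndarray) -> np.ndarray:
    """Concatenate per-modality features (6 + 6 = 12 features per trial here)."""
    return np.concatenate([channel_features(accel), channel_features(emg)])

# Synthetic placeholder trials: one accelerometer window and one sEMG window
# per gesture, labeled as digit 0 or 1. Real data would come from the wearable.
rng = np.random.default_rng(0)
trials = [(rng.standard_normal(FS), rng.standard_normal(FS)) for _ in range(40)]
X = np.vstack([fused_features(a, e) for a, e in trials])
y = np.array([0, 1] * 20)

# Standardize features, then train/evaluate an RBF-kernel SVM via cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

Extending the same feature matrix to all ten digits would simply swap the binary labels for ten-class labels and the SVM for a small feed-forward neural network, mirroring the abstract's second experiment.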