{"title":"EXG可穿戴人机界面,实现虚拟现实环境下的自然多模态交互","authors":"Ker-Jiun Wang, Quanbo Liu, Soumya Vhasure, Quanfeng Liu, C. Zheng, Prakash Thakur","doi":"10.1145/3281505.3281577","DOIUrl":null,"url":null,"abstract":"Current assistive technologies are complicated, cumbersome, not portable, and users still need to apply extensive fine motor control to operate the device. Brain-Computer Interfaces (BCIs) could provide an alternative approach to solve these problems. However, the current BCIs have low classification accuracy and require tedious human-learning procedures. The use of complicated Electroencephalogram (EEG) caps, where many electrodes must be attached on the user's head to identify imaginary motor commands, brings a lot of inconvenience. In this demonstration, we will showcase EXGbuds, a compact, non-obtrusive, and comfortable wearable device with non-invasive biosensing technology. People can comfortably wear it for long hours without tiring. Under our developed machine learning algorithms, we can identify various eye movements and facial expressions with over 95% accuracy, such that people with motor disabilities could have a fun time to play VR games totally \"Hands-free\".","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"EXG wearable human-machine interface for natural multimodal interaction in VR environment\",\"authors\":\"Ker-Jiun Wang, Quanbo Liu, Soumya Vhasure, Quanfeng Liu, C. Zheng, Prakash Thakur\",\"doi\":\"10.1145/3281505.3281577\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Current assistive technologies are complicated, cumbersome, not portable, and users still need to apply extensive fine motor control to operate the device. Brain-Computer Interfaces (BCIs) could provide an alternative approach to solve these problems. However, the current BCIs have low classification accuracy and require tedious human-learning procedures. The use of complicated Electroencephalogram (EEG) caps, where many electrodes must be attached on the user's head to identify imaginary motor commands, brings a lot of inconvenience. In this demonstration, we will showcase EXGbuds, a compact, non-obtrusive, and comfortable wearable device with non-invasive biosensing technology. People can comfortably wear it for long hours without tiring. 
Under our developed machine learning algorithms, we can identify various eye movements and facial expressions with over 95% accuracy, such that people with motor disabilities could have a fun time to play VR games totally \\\"Hands-free\\\".\",\"PeriodicalId\":138249,\"journal\":{\"name\":\"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3281505.3281577\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3281505.3281577","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
EXG wearable human-machine interface for natural multimodal interaction in VR environment
Abstract: Current assistive technologies are complicated, cumbersome, and not portable, and users still need extensive fine motor control to operate them. Brain-Computer Interfaces (BCIs) could provide an alternative approach to these problems. However, current BCIs have low classification accuracy and require tedious human-learning procedures, and the complicated Electroencephalogram (EEG) caps they rely on, with many electrodes attached to the user's head to identify imagined motor commands, are highly inconvenient. In this demonstration, we showcase EXGbuds, a compact, non-obtrusive, and comfortable wearable device built on non-invasive biosensing technology. People can wear it comfortably for long hours without fatigue. Using our machine learning algorithms, we identify various eye movements and facial expressions with over 95% accuracy, so that people with motor disabilities can enjoy VR games completely "hands-free".
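The abstract describes a windowed classification pipeline over non-invasive biosignals (eye movements and facial expressions recognized from a wearable sensor). The paper does not disclose its algorithm, so the following is only a minimal illustrative sketch of that general approach: slicing a multi-channel recording into fixed-length windows, extracting simple time-domain features per channel, and training an off-the-shelf classifier. The sampling rate, channel count, feature set, class labels, and synthetic data are all assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch of a windowed biosignal classification pipeline;
# NOT the authors' actual EXGbuds algorithm. Requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

FS = 250            # assumed sampling rate (Hz)
WINDOW = FS         # 1-second analysis windows
N_CHANNELS = 4      # assumed number of electrode channels

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel time-domain features: mean, RMS, peak-to-peak."""
    return np.concatenate([
        window.mean(axis=0),
        np.sqrt((window ** 2).mean(axis=0)),
        window.max(axis=0) - window.min(axis=0),
    ])

def make_windows(signal: np.ndarray, labels: np.ndarray):
    """Slice a (samples, channels) recording into labeled feature windows."""
    X, y = [], []
    for start in range(0, len(signal) - WINDOW + 1, WINDOW):
        X.append(features(signal[start:start + WINDOW]))
        # label each window by the majority sample label inside it
        y.append(np.bincount(labels[start:start + WINDOW]).argmax())
    return np.array(X), np.array(y)

# Synthetic placeholder data standing in for a real EXG recording;
# hypothetical classes: 0 = neutral, 1 = eye blink, 2 = jaw clench.
rng = np.random.default_rng(0)
signal = rng.normal(size=(60 * FS, N_CHANNELS))
labels = rng.integers(0, 3, size=60 * FS)

X, y = make_windows(signal, labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

On real recordings, the reported 95%+ accuracy would hinge on feature design and labeled training data for each gesture class; the random data above will score near chance and serves only to show the pipeline's shape.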