Leveraging Dual-Observable Input for Fine-Grained Thumb Interaction Using Forearm EMG

D. Huang, Xiaoyi Zhang, T. S. Saponas, J. Fogarty, Shyamnath Gollakota
{"title":"Leveraging Dual-Observable Input for Fine-Grained Thumb Interaction Using Forearm EMG","authors":"D. Huang, Xiaoyi Zhang, T. S. Saponas, J. Fogarty, Shyamnath Gollakota","doi":"10.1145/2807442.2807506","DOIUrl":null,"url":null,"abstract":"We introduce the first forearm-based EMG input system that can recognize fine-grained thumb gestures, including left swipes, right swipes, taps, long presses, and more complex thumb motions. EMG signals for thumb motions sensed from the forearm are quite weak and require significant training data to classify. We therefore also introduce a novel approach for minimally-intrusive collection of labeled training data for always-available input devices. Our dual-observable input approach is based on the insight that interaction observed by multiple devices allows recognition by a primary device (e.g., phone recognition of a left swipe gesture) to create labeled training examples for another (e.g., forearm-based EMG data labeled as a left swipe). We implement a wearable prototype with dry EMG electrodes, train with labeled demonstrations from participants using their own phones, and show that our prototype can recognize common fine-grained thumb gestures and user-defined complex gestures.","PeriodicalId":103668,"journal":{"name":"Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"32","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2807442.2807506","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 32

Abstract

We introduce the first forearm-based EMG input system that can recognize fine-grained thumb gestures, including left swipes, right swipes, taps, long presses, and more complex thumb motions. EMG signals for thumb motions sensed from the forearm are quite weak and require significant training data to classify. We therefore also introduce a novel approach for minimally-intrusive collection of labeled training data for always-available input devices. Our dual-observable input approach is based on the insight that interaction observed by multiple devices allows recognition by a primary device (e.g., phone recognition of a left swipe gesture) to create labeled training examples for another (e.g., forearm-based EMG data labeled as a left swipe). We implement a wearable prototype with dry EMG electrodes, train with labeled demonstrations from participants using their own phones, and show that our prototype can recognize common fine-grained thumb gestures and user-defined complex gestures.
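The core of the dual-observable approach is that a gesture the phone already recognizes on its touchscreen can label the time-aligned EMG signal recorded from the forearm, yielding training examples without a separate labeling session. The sketch below illustrates this idea only; the channel count, sampling rate, window length, features, and classifier are assumptions for illustration and are not the authors' implementation.

    # Illustrative sketch of dual-observable labeling: gestures recognized on
    # the phone provide labels for time-aligned forearm EMG windows, which are
    # then used to train a gesture classifier. All names, window sizes, and
    # features here are assumptions, not the paper's actual pipeline.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    SAMPLE_RATE_HZ = 1000      # assumed EMG sampling rate
    WINDOW_S = 1.0             # assumed gesture window length

    def emg_window(emg, t_event):
        """Cut the EMG samples surrounding a phone-recognized gesture event."""
        start = max(int((t_event - WINDOW_S / 2) * SAMPLE_RATE_HZ), 0)
        stop = start + int(WINDOW_S * SAMPLE_RATE_HZ)
        return emg[:, start:stop]   # shape: (channels, samples)

    def features(window):
        """Simple per-channel amplitude features (mean absolute value, RMS)."""
        mav = np.mean(np.abs(window), axis=1)
        rms = np.sqrt(np.mean(window ** 2, axis=1))
        return np.concatenate([mav, rms])

    def build_training_set(emg, phone_events):
        """phone_events: list of (timestamp_s, gesture_label) produced by the
        phone's own touch recognizer, e.g. ('left_swipe'). Each event labels
        one EMG window."""
        X = np.array([features(emg_window(emg, t)) for t, _ in phone_events])
        y = np.array([label for _, label in phone_events])
        return X, y

    # Example with synthetic data: 8 EMG channels, 60 s of signal, a few events.
    rng = np.random.default_rng(0)
    emg = rng.normal(size=(8, 60 * SAMPLE_RATE_HZ))
    phone_events = [(5.0, "left_swipe"), (12.0, "right_swipe"),
                    (20.0, "tap"), (31.0, "long_press"),
                    (40.0, "left_swipe"), (52.0, "tap")]

    X, y = build_training_set(emg, phone_events)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    # Once trained, EMG windows alone can be classified without the phone.
    print(clf.predict(X[:2]))

Once enough phone-labeled examples have accumulated, the EMG classifier can recognize the same gestures when the phone is not in hand, which is the always-available input scenario the abstract describes.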