TriTap: Identifying Finger Touches on Smartwatches

Hyunjae Gil, Doyoung Lee, Seunggyu Im, Ian Oakley
{"title":"TriTap: Identifying Finger Touches on Smartwatches","authors":"Hyunjae Gil, Doyoung Lee, Seunggyu Im, Ian Oakley","doi":"10.1145/3025453.3025561","DOIUrl":null,"url":null,"abstract":"The small screens of smartwatches provide limited space for input tasks. Finger identification is a promising technique to address this problem by associating different functions with different fingers. However, current technologies for finger identification are unavailable or unsuitable for smartwatches. To address this problem, this paper observes that normal smartwatch use takes places with a relatively static pose between the two hands. In this situation, we argue that the touch and angle profiles generated by different fingers on a standard smartwatch touch screen will differ sufficiently to support reliable identification. The viability of this idea is explored in two studies that capture touches in natural and exaggerated poses during tapping and swiping tasks. Machine learning models report accuracies of up to 93% and 98% respectively, figures that are sufficient for many common interaction tasks. Furthermore, the exaggerated poses show modest costs (in terms of time/errors) compared to the natural touches. 
We conclude by presenting examples and discussing how interaction designs using finger identification can be adapted to the smartwatch form factor.","PeriodicalId":299396,"journal":{"name":"Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems","volume":"58 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"43","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3025453.3025561","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 43

Abstract

The small screens of smartwatches provide limited space for input tasks. Finger identification is a promising technique to address this problem by associating different functions with different fingers. However, current technologies for finger identification are unavailable or unsuitable for smartwatches. To address this problem, this paper observes that normal smartwatch use takes place with a relatively static pose between the two hands. In this situation, we argue that the touch and angle profiles generated by different fingers on a standard smartwatch touch screen will differ sufficiently to support reliable identification. The viability of this idea is explored in two studies that capture touches in natural and exaggerated poses during tapping and swiping tasks. Machine learning models report accuracies of up to 93% and 98% respectively, figures that are sufficient for many common interaction tasks. Furthermore, the exaggerated poses show modest costs (in terms of time/errors) compared to the natural touches. We conclude by presenting examples and discussing how interaction designs using finger identification can be adapted to the smartwatch form factor.
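To illustrate the core idea, that a static pose between the hands makes each finger's touch and angle profile separable, the sketch below classifies synthetic touch features with a simple nearest-centroid rule. All names, feature choices (position, contact-axis length, orientation), and data are hypothetical; the paper's actual studies used more capable machine learning models trained on real touchscreen data.

```python
import math
import random

# Hypothetical touch features: (contact_x, contact_y, major_axis, orientation_deg).
# With a relatively static hand pose, each finger should leave a distinct
# touch/angle profile; a nearest-centroid classifier is enough to show the idea.

def centroid(samples):
    """Mean feature vector of a list of feature tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def train(labelled):
    """labelled: dict finger_name -> list of feature tuples."""
    return {finger: centroid(samples) for finger, samples in labelled.items()}

def identify(model, touch):
    """Return the finger whose centroid is nearest to the touch features."""
    return min(model, key=lambda finger: math.dist(model[finger], touch))

# Synthetic clusters standing in for the per-finger touch profiles.
random.seed(0)
def cluster(cx, cy, axis, angle, n=30):
    return [(cx + random.gauss(0, 2), cy + random.gauss(0, 2),
             axis + random.gauss(0, 0.5), angle + random.gauss(0, 3))
            for _ in range(n)]

data = {
    "thumb":  cluster(20, 60, 11, 40),
    "index":  cluster(35, 30, 8, 75),
    "middle": cluster(50, 35, 9, 90),
}
model = train(data)
print(identify(model, (21, 59, 11, 42)))   # a touch near the thumb profile
```

In practice, a richer classifier and per-user calibration would be needed, but the sketch shows why distinct pose-dependent profiles make identification tractable on a stock capacitive touchscreen.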