Expressive touch: studying tapping force on tabletops

E. Pedersen, K. Hornbæk
{"title":"Expressive touch: studying tapping force on tabletops","authors":"E. Pedersen, K. Hornbæk","doi":"10.1145/2556288.2557019","DOIUrl":null,"url":null,"abstract":"This paper investigates users' ability to perform force-sensitive tapping and explores its potential as an input modality in touch-based systems. We study force-sensitive tapping using Expressive Touch, a tabletop interface that infers tapping force from the sound waves created by the users' finger upon impact. The first part of the paper describes the implementation details of Expressive Touch and shows how existing tabletop interfaces can be augmented to reliably detect tapping force across the entire surface. The second part of the paper reports on the results of three studies of force-sensitive tapping. First, we use a classic psychophysic task to gain insights into participants' perception of tapping force (Study 1). Results show that although participants tap with different absolute tapping forces, they have a similar perception of relative tapping force. Second, we investigate participants' ability to control tapping force (Study 2) and find that users can produce two force levels with 99% accuracy. For six levels of force, accuracy drops to 58%. Third, we investigate the usability of force tapping by studying participants' reactions to seven force-sensitive touch applications (Study 3).","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"25","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2556288.2557019","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 25

Abstract

This paper investigates users' ability to perform force-sensitive tapping and explores its potential as an input modality in touch-based systems. We study force-sensitive tapping using Expressive Touch, a tabletop interface that infers tapping force from the sound waves created by the user's finger upon impact. The first part of the paper describes the implementation details of Expressive Touch and shows how existing tabletop interfaces can be augmented to reliably detect tapping force across the entire surface. The second part of the paper reports the results of three studies of force-sensitive tapping. First, we use a classic psychophysical task to gain insight into participants' perception of tapping force (Study 1); results show that although participants tap with different absolute forces, they have a similar perception of relative tapping force. Second, we investigate participants' ability to control tapping force (Study 2) and find that users can produce two force levels with 99% accuracy; for six force levels, accuracy drops to 58%. Third, we investigate the usability of force tapping by studying participants' reactions to seven force-sensitive touch applications (Study 3).
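
The abstract describes the sensing idea only at a high level: the force of a tap is inferred from the acoustic impulse the finger makes on impact and then quantized into a small number of discrete levels. The sketch below illustrates that pipeline in Python under stated assumptions; it is not the authors' implementation, and the peak-energy feature, the `tap_intensity` and `quantize_force` names, and the calibration threshold are all illustrative.

```python
# Minimal, hypothetical sketch of acoustic tap-force estimation (not the
# authors' implementation): compute a per-tap intensity from the impact
# transient, then quantize it into discrete force levels.
import numpy as np


def tap_intensity(signal: np.ndarray, window: int = 256) -> float:
    """Peak short-term energy of the impact transient in a mono buffer
    containing one tap; used here as a proxy for tapping force."""
    energy = np.convolve(signal.astype(float) ** 2,
                         np.ones(window) / window, mode="same")
    return float(np.max(energy))


def quantize_force(intensity: float, thresholds: list) -> int:
    """Map a tap intensity to a discrete force level (0 = softest) using
    per-user calibration boundaries."""
    return int(np.searchsorted(thresholds, intensity))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(4410)  # ~0.1 s at 44.1 kHz
    # Placeholder data: fake a soft and a hard tap as damped noise bursts.
    soft = 0.2 * np.exp(-t / 300) * rng.standard_normal(t.size)
    hard = 1.0 * np.exp(-t / 300) * rng.standard_normal(t.size)
    thresholds = [0.1]  # one boundary -> two force levels (illustrative value)
    for name, tap in (("soft", soft), ("hard", hard)):
        level = quantize_force(tap_intensity(tap), thresholds)
        print(f"{name} tap -> force level {level}")
```

In a sketch like this, the calibration thresholds would be learned per user from a few example taps at each target level, which is consistent with the paper's finding that more force levels are harder to produce reliably (99% accuracy for two levels versus 58% for six).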