Guiding gaze gestures on smartwatches: Introducing fireworks

IF 5.3 · CAS Zone 2 (Computer Science) · JCR Q1, COMPUTER SCIENCE, CYBERNETICS
William Delamare, Daichi Harada, Luxi Yang, Xiangshi Ren
{"title":"Guiding gaze gestures on smartwatches: Introducing fireworks","authors":"William Delamare ,&nbsp;Daichi Harada ,&nbsp;Luxi Yang ,&nbsp;Xiangshi Ren","doi":"10.1016/j.ijhcs.2023.103196","DOIUrl":null,"url":null,"abstract":"<div><p><span>Smartwatches enable interaction anytime and anywhere, with both digital and augmented physical objects. However, situations with busy hands can prevent user inputs. To address this limitation, we propose Fireworks, an innovative hands-free alternative that empowers smartwatch users to trigger commands effortlessly through intuitive gaze gestures by providing post-activation guidance. Fireworks allows command activation by guiding users to follow targets moving from the screen center to the edge, mimicking real life fireworks. We present the experimental design and evaluation of two Fireworks instances. The first design employs temporal </span>parallelization, displaying few dynamic targets during microinteractions (e.g., snoozing a notification while cooking). The second design sequentially displays targets to support more commands (e.g., 20 commands), ideal for various scenarios other than microinteractions (e.g., turn on lights in a smart home). Results show that Fireworks’ single straight gestures enable faster and more accurate command selection compared to state-of-the-art baselines, namely Orbits and Stroke. Additionally, participants expressed a clear preference for Fireworks’ original visual guidance.</p></div>","PeriodicalId":54955,"journal":{"name":"International Journal of Human-Computer Studies","volume":null,"pages":null},"PeriodicalIF":5.3000,"publicationDate":"2023-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Human-Computer Studies","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1071581923002057","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0

Abstract

Smartwatches enable interaction anytime and anywhere, with both digital and augmented physical objects. However, situations with busy hands can prevent user inputs. To address this limitation, we propose Fireworks, an innovative hands-free alternative that empowers smartwatch users to trigger commands effortlessly through intuitive gaze gestures by providing post-activation guidance. Fireworks allows command activation by guiding users to follow targets moving from the screen center to the edge, mimicking real-life fireworks. We present the experimental design and evaluation of two Fireworks instances. The first design employs temporal parallelization, displaying a few dynamic targets during microinteractions (e.g., snoozing a notification while cooking). The second design displays targets sequentially to support more commands (e.g., 20 commands), making it ideal for scenarios beyond microinteractions (e.g., turning on lights in a smart home). Results show that Fireworks’ single straight gestures enable faster and more accurate command selection than state-of-the-art baselines, namely Orbits and Stroke. Additionally, participants expressed a clear preference for Fireworks’ original visual guidance.
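
The abstract's core mechanic can be illustrated with a short sketch. The following is a hypothetical, minimal Python example and not the authors' implementation: it assumes commands are laid out on evenly spaced radial directions, animates targets from the screen center toward the edge, and maps a gaze displacement to the closest target direction within an angular tolerance. The function names, the command set, and the 20-degree tolerance are illustrative assumptions.

# Minimal sketch of a Fireworks-style radial selection (hypothetical, not the paper's code).
import math

def target_positions(num_targets, t, radius):
    """Positions of `num_targets` targets at normalized animation time t in [0, 1],
    each moving outward from the center along its own radial direction."""
    positions = []
    for i in range(num_targets):
        angle = 2 * math.pi * i / num_targets
        r = t * radius  # targets travel from the center (r = 0) to the edge (r = radius)
        positions.append((r * math.cos(angle), r * math.sin(angle)))
    return positions

def select_command(gaze_dx, gaze_dy, commands, angle_tolerance_deg=20.0):
    """Pick the command whose radial direction is closest to the gaze
    displacement (gaze_dx, gaze_dy), if within an angular tolerance."""
    gaze_angle = math.degrees(math.atan2(gaze_dy, gaze_dx))
    best, best_diff = None, angle_tolerance_deg
    for i, command in enumerate(commands):
        target_angle = 360.0 * i / len(commands)
        # Smallest absolute angular difference, in [0, 180] degrees.
        diff = abs((gaze_angle - target_angle + 180) % 360 - 180)
        if diff <= best_diff:
            best, best_diff = command, diff
    return best

if __name__ == "__main__":
    commands = ["snooze", "dismiss", "reply", "open"]
    print(target_positions(len(commands), 0.5, radius=100))  # targets halfway to the edge
    # Gaze moved mostly upward (x to the right, y up in this sketch).
    print(select_command(0.1, 1.0, commands))  # -> "reply" (target at 90 degrees)

In the paper's terms, the first Fireworks instance would animate a few such targets in parallel, while the second would reveal them sequentially to scale to larger command sets; this sketch only captures the shared center-to-edge geometry.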

Source journal
International Journal of Human-Computer Studies (Engineering & Technology – Computer Science: Cybernetics)
CiteScore: 11.50
Self-citation rate: 5.60%
Articles per year: 108
Review time: 3 months
Aims and scope: The International Journal of Human-Computer Studies publishes original research over the whole spectrum of work relevant to the theory and practice of innovative interactive systems. The journal is inherently interdisciplinary, covering research in computing, artificial intelligence, psychology, linguistics, communication, design, engineering, and social organization, which is relevant to the design, analysis, evaluation and application of innovative interactive systems. Papers at the boundaries of these disciplines are especially welcome, as it is our view that interdisciplinary approaches are needed for producing theoretical insights in this complex area and for effective deployment of innovative technologies in concrete user communities. Research areas relevant to the journal include, but are not limited to:
• Innovative interaction techniques
• Multimodal interaction
• Speech interaction
• Graphic interaction
• Natural language interaction
• Interaction in mobile and embedded systems
• Interface design and evaluation methodologies
• Design and evaluation of innovative interactive systems
• User interface prototyping and management systems
• Ubiquitous computing
• Wearable computers
• Pervasive computing
• Affective computing
• Empirical studies of user behaviour
• Empirical studies of programming and software engineering
• Computer supported cooperative work
• Computer mediated communication
• Virtual reality
• Mixed and augmented reality
• Intelligent user interfaces
• Presence
...