Designing gaze-supported multimodal interactions for the exploration of large image collections

S. Stellmach, S. Stober, A. Nürnberger, Raimund Dachselt
DOI: 10.1145/1983302.1983303
Venue: Conference on Novel Gaze-Controlled Applications
Published: 2011-05-26
Citations: 67

Abstract

While eye tracking is becoming more and more relevant as a promising input channel, diverse applications using gaze control in a more natural way are still rather limited. Though several researchers have indicated the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated. However, time-consuming dwell-time activations limit this potential. To overcome this, we present a gaze-supported fisheye lens in combination with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. In a user-centered design approach, we elicited how users would use the aforementioned input combinations. Based on the received feedback, we designed a prototype system for interaction with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study testing our prototype provided further insights into how well the elaborated gaze-supported interaction techniques were experienced by users.
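The core idea of a gaze-supported fisheye lens — local magnification centered on the current gaze point rather than the mouse cursor — can be sketched as follows. This is a minimal illustration using the classic Sarkar–Brown graphical fisheye distortion profile; the function name, radius, and magnification parameters are assumptions for demonstration, not the paper's actual implementation.

```python
import math

def fisheye_magnify(px, py, gaze_x, gaze_y, radius=150.0, magnification=3.0):
    """Map a screen point (px, py) through a fisheye lens centered on the
    current gaze position: points near the gaze center are pushed outward
    (magnified), points at or beyond `radius` are left untouched."""
    dx, dy = px - gaze_x, py - gaze_y
    dist = math.hypot(dx, dy)
    if dist >= radius or dist == 0.0:
        return px, py
    # Normalized distance in [0, 1); the Sarkar-Brown profile
    # g(d) = (m + 1) * d / (m * d + 1) expands distances near the center.
    d = dist / radius
    d_new = d * (magnification + 1.0) / (magnification * d + 1.0)
    scale = (d_new * radius) / dist
    return gaze_x + dx * scale, gaze_y + dy * scale
```

In a gaze-supported setup like the one described, `gaze_x`/`gaze_y` would be fed continuously from the eye tracker, while selection and lens adjustment are delegated to the keyboard or touch-and-tilt device, avoiding dwell-time activation entirely.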