Conference on Novel Gaze-Controlled Applications: Latest Publications

Gaze interaction from bed
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983313
J. P. Hansen, Javier San Agustin, H. Skovsgaard
Abstract: This paper presents a low-cost gaze tracking solution for bedbound people, composed of freeware tracking software and commodity hardware. Gaze interaction is done on a large wall-projected image, visible to everyone present in the room. The hardware leaves physical space free for assisting the person. Accuracy and precision of the tracking system were tested in an experiment with 12 subjects. We obtained tracking quality sufficient to control applications designed for gaze interaction. The best tracking conditions were achieved when people were sitting up rather than lying down. Gaze tracking in the bottom part of the image was also found to be more precise than in the top part.
Citations: 15
Gaze and voice controlled drawing
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983311
J. Kamp, V. Sundstedt
Abstract: Eye tracking is a process that allows an observer's gaze to be determined in real time by measuring their eye movements. Recent work has examined the possibility of using gaze control as an alternative input modality in interactive applications. Alternative means of interaction are especially important for disabled users for whom traditional techniques, such as mouse and keyboard, may not be feasible. This paper proposes a novel combination of gaze and voice commands as a means of hands-free interaction in a paint-style program. A drawing application controllable by gaze and voice input is implemented. Voice commands are used to activate drawing, which allows gaze to be used only for positioning the cursor; in previous work, gaze has also been used to activate drawing via dwell time. The drawing application is evaluated using subjective responses from participant user trials. The main result indicates that although gaze and voice offered less control than traditional input devices, participants reported that it was more enjoyable.
Citations: 48
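The division of labor described in this abstract (gaze positions the cursor; voice toggles whether ink is laid down) can be sketched as a small event handler. This is a minimal illustration of the pattern, not the paper's implementation; the event names and command vocabulary ("draw"/"stop") are assumptions.

```python
# Sketch of gaze-plus-voice drawing: gaze moves the cursor, while voice
# commands toggle whether gaze samples are added to the current stroke.
# Command names and the event interface are hypothetical.

class GazeVoiceCanvas:
    def __init__(self):
        self.drawing = False
        self.strokes = []              # each stroke is a list of (x, y) points

    def on_voice(self, command):
        """Voice activates/deactivates drawing, so gaze is used only for pointing."""
        if command == "draw":
            self.drawing = True
            self.strokes.append([])    # start a new stroke
        elif command == "stop":
            self.drawing = False

    def on_gaze(self, x, y):
        """Gaze samples move the cursor; they leave ink only while drawing is active."""
        if self.drawing:
            self.strokes[-1].append((x, y))

canvas = GazeVoiceCanvas()
canvas.on_gaze(10, 10)        # cursor moves, nothing drawn
canvas.on_voice("draw")
canvas.on_gaze(12, 11)
canvas.on_gaze(14, 12)
canvas.on_voice("stop")
canvas.on_gaze(50, 50)        # ignored again
print(canvas.strokes)         # [[(12, 11), (14, 12)]]
```

The point of the design, as the abstract notes, is that drawing activation never depends on dwell time, so wandering gaze cannot accidentally draw.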
Exploring interaction modes for image retrieval
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983312
C. Engelman, Rui Li, J. Pelz, P. Shi, Anne R. Haake
Abstract: The number of digital images in use is growing at an increasing rate across a wide array of application domains. There is therefore an ever-growing need for innovative ways to help end users gain access to these images quickly and effectively. Moreover, it is becoming increasingly difficult to manually annotate these images, for example with text labels, to generate useful metadata. One method for helping users gain access to digital images is content-based image retrieval (CBIR). Practical use of CBIR systems has been limited by several "gaps", including the well-known semantic gap and usability gaps [1]. Innovative designs are needed to bring end users into the loop to bridge these gaps. Our human-centered approaches integrate human perception and multimodal interaction to facilitate more usable and effective image retrieval. Here we show that multi-touch interaction is more usable than gaze-based interaction for explicit image region selection.
Citations: 0
Evaluation of a remote webcam-based eye tracker
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983309
H. Skovsgaard, Javier San Agustin, Sune Alstrup Johansen, J. P. Hansen, M. Tall
Abstract: In this paper we assess the performance of an open-source gaze tracker in a remote (i.e., table-mounted) setup and compare it with two commercial eye trackers. An experiment with 5 subjects showed the open-source eye tracker to have significantly higher accuracy than one of the commercial systems, the Mirametrix S1, but also a higher error rate than the other commercial system, a Tobii T60. We conclude that the webcam solution may be viable for people who need a substitute for mouse input but cannot afford a commercial system.
Citations: 11
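Tracker evaluations like the two above typically report two distinct quality metrics: accuracy (mean offset of gaze samples from the known target) and precision (sample-to-sample jitter, often as RMS). A minimal sketch of both, assuming gaze samples and target positions share one coordinate unit; the specific formulas are standard definitions, not taken from these papers:

```python
# Standard gaze-quality metrics: accuracy = mean offset from the true target,
# precision = RMS of successive sample-to-sample distances (spatial jitter).
import math

def accuracy(samples, target):
    """Mean Euclidean offset of gaze samples from the known target position."""
    tx, ty = target
    return sum(math.hypot(x - tx, y - ty) for x, y in samples) / len(samples)

def precision_rms(samples):
    """RMS of distances between successive samples during a fixation."""
    d2 = [(x2 - x1) ** 2 + (y2 - y1) ** 2
          for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return math.sqrt(sum(d2) / len(d2))

# Samples recorded while the subject fixates a target at (100, 100):
samples = [(101.0, 99.0), (99.0, 101.0), (101.0, 101.0)]
print(accuracy(samples, (100.0, 100.0)))   # 1.414... (each sample is sqrt(2) away)
print(precision_rms(samples))              # 2.449... (sqrt(6))
```

A tracker can be accurate but imprecise (samples scattered around the right spot) or precise but inaccurate (tight cluster with a constant offset), which is why both numbers appear in comparisons like the one above.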
Comparison of gaze-to-objects mapping algorithms
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983308
O. Špakov
Abstract: Gaze data processing is an important and necessary step in gaze-based applications. This study compares several gaze-to-object mapping algorithms using various dwell times for selection, presenting targets of several types and sizes. Seven algorithms from the literature were compared against two newly designed algorithms. The study revealed that a fractional mapping algorithm (known) produced the highest rate of correct selections and the fastest selection times, but also the highest rate of incorrect selections. The dynamic competing algorithm (newly designed) showed the next best result, but also a high rate of incorrect selections. The type of target had only a small impact on the calculated statistics. Strictly centered gazing helped to increase the rate of correct selections for all algorithms and target types. Directions for further improvement of mapping algorithms and future investigation are outlined.
Citations: 17
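As a baseline for the family of algorithms this study compares, a naive gaze-to-object mapping assigns each gaze sample to the object under it and selects an object once gaze dwells on it continuously past a threshold. The sketch below illustrates that baseline only; the object layout and the 500 ms dwell time are illustrative assumptions, not values or algorithms from the study.

```python
# Naive gaze-to-object mapping with dwell-time selection: each sample maps
# to the object whose bounding box contains it; continuous dwell past a
# threshold triggers selection. Layout and threshold are hypothetical.

DWELL_MS = 500

class DwellSelector:
    def __init__(self, objects):
        self.objects = objects          # name -> (x, y, w, h) bounding box
        self.current = None             # object currently gazed at
        self.dwell_start = None         # timestamp when that dwell began

    def _hit(self, gx, gy):
        for name, (x, y, w, h) in self.objects.items():
            if x <= gx <= x + w and y <= gy <= y + h:
                return name
        return None

    def feed(self, gx, gy, t_ms):
        """Feed one timestamped gaze sample; return the selected object or None."""
        obj = self._hit(gx, gy)
        if obj != self.current:         # gaze moved to a new object (or to none)
            self.current, self.dwell_start = obj, t_ms
            return None
        if obj is not None and t_ms - self.dwell_start >= DWELL_MS:
            self.dwell_start = t_ms     # reset so the selection does not repeat
            return obj
        return None

sel = DwellSelector({"button": (0, 0, 100, 50)})
print(sel.feed(10, 10, 0))      # None -- dwell starts
print(sel.feed(12, 11, 300))    # None -- still under threshold
print(sel.feed(11, 12, 600))    # button -- dwell threshold exceeded
```

The algorithms compared in the paper (e.g., fractional and competing mappings) refine exactly the `_hit` step, spreading a sample's evidence across nearby objects instead of committing to a single bounding-box test.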
Mobile gaze-based screen interaction in 3D environments
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983304
D. Mardanbegi, D. Hansen
Abstract: Head-mounted eye trackers can be used for mobile interaction as well as for gaze estimation. This paper presents a method that enables the user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. An effective method for identifying the screens in the user's field of view is also presented, which applies to a general scenario in which multiple users interact with multiple screens. A particular application of this technique is implemented in a home environment with two large screens and a mobile phone, in which a user was able to interact with the screens using a wireless head-mounted eye tracker.
Citations: 46
Towards intelligent user interfaces: anticipating actions in computer games
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983306
Hendrik Koesling, A. Kenny, A. Finke, H. Ritter, S. McLoone, Tomas E. Ward
Abstract: The study demonstrates how on-line processing of eye movements in First Person Shooter (FPS) games helps to predict player decisions regarding subsequent actions. Based on action-control theory, we identify distinct cognitive orientations in pre- and post-decisional phases. Cognitive orientations differ with regard to the width of attention, or "receptiveness": in the pre-decisional phase players process as much information as possible, and then focus on implementing intended actions in the post-decisional phase. Participants viewed animated sequences of FPS games and decided which game character to rescue and how to implement their action. Oculomotor data shows a clear distinction between the width of attention in pre- and post-decisional phases, supporting the Rubicon model of action phases. Attention rapidly narrows when the goal intention is formed. We identify a lag of 800--900 ms between goal formation (the "cognitive Rubicon") and motor response. Game engines may use this lag to respond anticipatively to actions that players have not yet executed. User interfaces with a gaze-dependent, gaze-controlled anticipation module should thus enhance game character behaviours and make them much "smarter".
Citations: 5
Hyakunin-Eyesshu: a tabletop Hyakunin-Isshu game with computer opponent by the action prediction based on gaze detection
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983307
Michiya Yamamoto, M. Komeda, Takashi Nagamatsu, Tomio Watanabe
Abstract: A tabletop interface can enable interactions between images and real objects using various sensors; therefore, it can be used to create many works in the media arts field. Focusing on gaze-and-touch interaction, we proposed the concept of an eye-tracking tabletop interface (ETTI) as a new type of interaction interface for the creation of media artworks. In this study, we developed "Hyakunin-Eyesshu," a prototype of ETTI content that enables users to play the traditional Japanese card game "Hyakunin-Isshu" against a computer character. We also demonstrated the system at an academic meeting and obtained user feedback. We expect this work to lead to advances in interfaces for various interactions and to new media artworks with precise gaze estimation.
Citations: 7
Eye tracking within the packaging design workflow: interaction with physical and virtual shelves
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983305
C. Tonkin, Andrew D. Ouzts, A. Duchowski
Abstract: Measuring consumers' overt visual attention through eye tracking is a useful method of assessing a package design's impact on likely buyer purchase patterns. To preserve ecological validity, subjects should remain immersed in a shopping context throughout the entire study. Immersion can be achieved through proper priming, environmental cues, and visual stimuli. While a complete physical store offers the most realistic environment, using projectors to create a virtual environment is desirable for reasons of efficiency, cost, and flexibility. Results are presented from a study comparing consumers' visual behavior in the presence of either virtual or physical shelving, using eye-movement performance and process metrics as well as subjective impressions. The analysis suggests a difference in visual search performance between the environments even though the perceived difference is negligible.
Citations: 48
Designing gaze-supported multimodal interactions for the exploration of large image collections
Conference on Novel Gaze-Controlled Applications · Pub Date: 2011-05-26 · DOI: 10.1145/1983302.1983303
S. Stellmach, S. Stober, A. Nürnberger, Raimund Dachselt
Abstract: While eye tracking is becoming more and more relevant as a promising input channel, diverse applications using gaze control in a natural way are still rather limited. Though several researchers have indicated the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated, and time-consuming dwell-time activations limit this potential. To overcome this, we present a gaze-supported fisheye lens in combination with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. In a user-centered design approach, we elicited how users would use these input combinations. Based on the feedback received, we designed a prototype system for interaction with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study testing our prototype provided further insight into how well the elaborated gaze-supported interaction techniques were experienced by users.
Citations: 67
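The Midas Touch avoidance strategy described in this last abstract (gaze only points; a second modality commits) can be reduced to a very small pattern. A minimal sketch with hypothetical event and target names, not the authors' system:

```python
# Gaze-supported selection without dwell activation: gaze continuously
# updates the hover target, and only an explicit touch event commits a
# selection -- so looking at something can never trigger an action by itself.

class GazeTouchSelector:
    def __init__(self):
        self.hovered = None

    def on_gaze(self, target):
        """Gaze is pointing only; no action is ever issued here."""
        self.hovered = target

    def on_touch(self):
        """Explicit confirmation on the second modality commits the target."""
        return self.hovered

ui = GazeTouchSelector()
ui.on_gaze("thumbnail_7")
ui.on_gaze("thumbnail_9")     # wandering gaze changes the hover, harmlessly
print(ui.on_touch())          # thumbnail_9
```

Splitting pointing and confirmation across modalities is what removes both the dwell-time wait and the risk of unintended gaze activations.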