Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications: Latest Publications

A novel gaze event detection metric that is not fooled by gaze-independent baselines
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319836
Mikhail Startsev, Stefan Göb, M. Dorr
Abstract: Eye movement classification algorithms are typically evaluated either in isolation (in terms of absolute values of some performance statistic) or in comparison to previously introduced approaches. In contrast to this, we first introduce and thoroughly evaluate a set of both random and above-chance baselines that are completely independent of the eye tracking signal recorded for each considered individual observer. Surprisingly, our baselines often show performance that is comparable to, or even exceeds, the scores of some established eye movement classification approaches, for smooth pursuit detection in particular. In these cases, it may be that (i) algorithm performance is poor, (ii) the data set is overly simplistic with little inter-subject variability of the eye movements, or, alternatively, (iii) the currently used evaluation metrics are inappropriate. Based on these observations, we discuss the level of stimulus dependency of the eye movements in four different data sets. Finally, we propose a novel measure of agreement between true and assigned eye movement events, which, unlike existing metrics, is able to reveal the expected performance gap between the baselines and dedicated algorithms.
Citations: 4
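To make the notion of a gaze-independent baseline concrete, here is a minimal sketch (hypothetical labels and class priors; not the authors' code): the baseline assigns every sample by drawing from the dataset's prior event distribution, never looking at the observer's gaze signal, yet a conventional sample-level score can still look respectable for the majority class.

```python
import numpy as np
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Hypothetical ground-truth event labels for one recording
# (0 = fixation, 1 = saccade, 2 = smooth pursuit).
true_labels = rng.choice([0, 1, 2], size=5000, p=[0.75, 0.10, 0.15])

# Gaze-independent baseline: draw every label from the dataset's prior
# event distribution; the observer's actual gaze signal is never used.
priors = np.bincount(true_labels, minlength=3) / true_labels.size
baseline_labels = rng.choice([0, 1, 2], size=true_labels.size, p=priors)

# Per-class F1: the fixation score can look deceptively good, which is one
# way such baselines "fool" conventional sample-level evaluation.
print(f1_score(true_labels, baseline_labels, average=None))
```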
Iris: a tool for designing contextually relevant gaze visualizations
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3317958.3318228
Sarah D'Angelo, Jeff Brewer, D. Gergle
Abstract: Advances in eye tracking technology have enabled new interaction techniques and gaze-based applications. However, the techniques for visualizing gaze information have remained relatively unchanged. We developed Iris, a tool to support the design of contextually relevant gaze visualizations. Iris allows users to explore displaying different features of gaze behavior, including the current fixation point, duration, and saccades. Stylistic elements such as color, opacity, and smoothness can also be adjusted to give users creative and detailed control over the design of their gaze visualization. We present the Iris system and perform a user study to examine how participants can make use of the tool to devise contextually relevant gaze visualizations for a variety of collaborative tasks. We show that changes in color and opacity as well as variation in gaze trails can be adjusted to create meaningful gaze visualizations that fit the context of use.
Citations: 3
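The kind of stylistic control Iris exposes can be sketched in a few lines (hypothetical data and parameter names; the real tool is interactive): a gaze trail whose color, opacity, and smoothness are adjustable, with older segments fading out.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Hypothetical 2D gaze samples (screen coordinates); a real tool would
# stream these from an eye tracker.
t = np.linspace(0, 2 * np.pi, 120)
gaze = np.column_stack([t, np.sin(t)]) + rng.normal(0, 0.02, (120, 2))

def draw_trail(points, color="tab:blue", max_alpha=0.8, smooth=5):
    """Render a gaze trail whose opacity fades with age; `smooth` is a
    moving-average window standing in for the tool's smoothness control."""
    kernel = np.ones(smooth) / smooth
    sm = np.column_stack([np.convolve(points[:, i], kernel, mode="valid")
                          for i in range(2)])
    alphas = np.linspace(0.05, max_alpha, len(sm) - 1)  # older segments fainter
    for i, a in enumerate(alphas):
        plt.plot(sm[i:i + 2, 0], sm[i:i + 2, 1], color=color, alpha=a, lw=2)

draw_trail(gaze)
plt.scatter(*gaze[-1], s=120, color="tab:blue")  # current fixation point
plt.show()
```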
Estimation of situation awareness score and performance using eye and head gaze for human-robot collaboration
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3322504
L. Paletta, Amir Dini, Cornelia Murko, S. Yahyanejad, U. Augsdörfer
Abstract: Human attention processes play a major role in the optimization of human-robot collaboration (HRC) [Huang et al. 2015]. We describe a novel methodology to measure and predict situation awareness from eye and head gaze features in real time. Awareness of scene objects of interest was described by 3D gaze analysis using data from eye tracking glasses and a precise optical tracking system. A probabilistic uncertainty framework copes with measurement errors in eye and position estimation. Comprehensive experiments on HRC were conducted with typical tasks, including handover, in a lab-based prototypical manufacturing environment. The gaze features correlate highly with scores of standardized questionnaires of situation awareness (SART [Taylor 1990], SAGAT [Endsley 2000]) and predict performance in the HRC task. This will open new opportunities for human-factors-based optimization in HRC applications.
Citations: 4
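A sketch of the validation step described above (simulated numbers; the feature name is illustrative, not from the paper): correlate a 3D-gaze feature with a situation-awareness questionnaire score, then use the linear fit as a simple predictor.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical per-participant data: one gaze feature (dwell ratio on
# task-relevant objects) and a SART questionnaire score. Values are
# simulated only to make the sketch runnable; the paper reports real
# HRC measurements.
n = 20
dwell_ratio = rng.uniform(0.2, 0.8, n)
sart_score = 10 + 25 * dwell_ratio + rng.normal(0, 2, n)

# Correlate the gaze feature with the situation-awareness score, as in the
# paper's validation step; a linear fit then serves as a simple predictor.
r, p = stats.pearsonr(dwell_ratio, sart_score)
slope, intercept, *_ = stats.linregress(dwell_ratio, sart_score)
print(f"r = {r:.2f} (p = {p:.3f}); predicted SART = {intercept:.1f} + {slope:.1f} * dwell")
```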
Classification of strategies for solving programming problems using AoI sequence analysis
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319825
U. Obaidellah, Michael Raschke, Tanja Blascheck
Abstract: This eye tracking study examines participants' visual attention when solving algorithmic problems in the form of programming problems. The stimuli consisted of a problem statement, example output, and a set of multiple-choice questions regarding variables, data types, and operations needed to solve the programming problems. We recorded eye movements of students and performed an Area of Interest (AoI) sequence analysis to identify reading strategies in terms of participants' performance and visual effort. Using classical eye tracking metrics and a visual AoI sequence analysis, we identified two main groups of participants: effective and ineffective problem solvers. This indicates that diversity of participants' mental schemas leads to a difference in their performance. Therefore, identifying how participants' reading behavior varies at a finer level of granularity warrants further investigation.
Citations: 6
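AoI sequence analysis of this kind can be illustrated with a short sketch (hypothetical AoIs and fixation coordinates): fixations are encoded as a string of AoI labels, and scanpaths are compared by edit distance, one common way to cluster reading strategies into groups.

```python
# AoIs as named rectangles: (x_min, y_min, x_max, y_max); the three regions
# mirror the stimulus layout described in the abstract.
AOIS = {"S": (0, 0, 800, 200),    # problem statement
        "O": (0, 200, 800, 400),  # example output
        "Q": (0, 400, 800, 600)}  # multiple-choice questions

def to_aoi_string(fixations):
    """Encode a list of (x, y) fixations as a string of AoI labels."""
    out = []
    for x, y in fixations:
        for label, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                out.append(label)
                break
    return "".join(out)

def edit_distance(a, b):
    """Plain Levenshtein distance between two AoI strings."""
    d = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, d[0] = d[0], i
        for j, cb in enumerate(b, 1):
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (ca != cb))
    return d[-1]

p1 = to_aoi_string([(100, 50), (120, 300), (90, 500)])    # S -> O -> Q
p2 = to_aoi_string([(400, 100), (410, 450), (300, 500)])  # S -> Q -> Q
print(p1, p2, edit_distance(p1, p2))
```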
Calibration-free text entry using smooth pursuit eye movements
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319838
Yasmeen Abdrabou, Mariam Mostafa, M. Khamis, Amr Elmougy
Abstract: In this paper, we propose a calibration-free gaze-based text entry system that uses smooth pursuit eye movements. We report on our implementation, which improves over prior work on smooth pursuit text entry by 1) eliminating the need for calibration using motion correlation, 2) increasing input rate from 3.34 to 3.41 words per minute, and 3) featuring text suggestions that were trained on 10,000 lexicon sentences recommended in the literature. We report on a user study (N=26) which shows that users are able to eye type at 3.41 words per minute without calibration and without user training. Qualitative feedback also indicates that users positively perceive the system. Our work is of particular benefit for disabled users and for situations when voice and tactile input are not feasible (e.g., in noisy environments or when the hands are occupied).
Citations: 7
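The calibration-free selection rests on motion correlation, which a brief sketch can illustrate (hypothetical targets and threshold; not the authors' implementation): each on-screen letter moves along its own trajectory, and the system selects the target whose motion correlates best with the raw gaze signal, so no mapping from eye coordinates to screen coordinates is ever needed.

```python
import numpy as np

rng = np.random.default_rng(3)

def select_target(gaze, targets, threshold=0.8):
    """Motion correlation: pick the moving target whose trajectory correlates
    best with the gaze trajectory over a time window. Only relative motion
    matters, which is why no calibration is required."""
    best, best_r = None, threshold
    for name, traj in targets.items():
        # Correlate x and y components separately, then combine.
        rx = np.corrcoef(gaze[:, 0], traj[:, 0])[0, 1]
        ry = np.corrcoef(gaze[:, 1], traj[:, 1])[0, 1]
        r = min(rx, ry)  # conservative: both axes must match
        if r > best_r:
            best, best_r = name, r
    return best

# Hypothetical letter targets moving on circular paths with distinct phases.
t = np.linspace(0, 2 * np.pi, 60)
targets = {chr(65 + k): np.column_stack([np.cos(t + k), np.sin(t + k)])
           for k in range(4)}
# Gaze that smoothly pursues target "B", plus tracker noise.
gaze = targets["B"] + rng.normal(0, 0.05, (60, 2))
print(select_target(gaze, targets))  # -> "B"
```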
A fast approach to refraction-aware eye-model fitting and gaze prediction
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319819
K. Dierkes, Moritz Kassner, A. Bulling
Abstract: By temporally integrating information about pupil contours extracted from eye images, model-based methods for glint-free gaze estimation can mitigate pupil detection noise. However, current approaches require time-consuming iterative solving of a nonlinear minimization problem to estimate key parameters, such as eyeball position. Based on the method presented by [Swirski and Dodgson 2013], we propose a novel approach to glint-free 3D eye-model fitting and gaze prediction using a single near-eye camera. By recasting model optimization as a least-squares intersection of lines, we make it amenable to a fast non-iterative solution. We further present a method for estimating deterministic refraction-correction functions from synthetic eye images and validate them on both synthetic and real eye images. We demonstrate the robustness of our method in the presence of pupil detection noise and show the benefit of temporal integration of pupil contour information on eyeball position and gaze estimation accuracy.
Citations: 22
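The key speed-up, recasting model optimization as a least-squares intersection of lines, reduces to a single linear solve. Here is a minimal sketch of that building block (illustrative only; the paper's full pipeline adds the eye model and refraction correction):

```python
import numpy as np

def nearest_point_to_lines(origins, directions):
    """Least-squares intersection of 3D lines p_i + t*d_i: the point x
    minimizing the summed squared distances to all lines solves the linear
    system (sum_i (I - d_i d_i^T)) x = sum_i (I - d_i d_i^T) p_i,
    which is non-iterative, unlike generic nonlinear minimization."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector orthogonal to the line
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# Hypothetical check: lines through (1, 2, 3) with varied directions recover it.
rng = np.random.default_rng(4)
center = np.array([1.0, 2.0, 3.0])
dirs = rng.normal(size=(10, 3))
origins = center + 2.5 * dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
print(nearest_point_to_lines(origins, dirs))  # ~ [1. 2. 3.]
```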
Ferns for area of interest free scanpath classification
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319826
Wolfgang Fuhl, Nora Castner, Thomas C. Kübler, Rene Alexander Lotz, W. Rosenstiel, Enkelejda Kasneci
Abstract: Scanpath classification can offer insight into the visual strategies of groups such as experts and novices. We propose to use random ferns in combination with saccade angle successions to compare scanpaths. One advantage of our method is that it does not require areas of interest to be computed or annotated. The conditional distribution in random ferns additionally allows for learning angle successions, which do not have to be entirely present in a scanpath. We evaluated our approach on two publicly available datasets and improved the classification accuracy by ≈10 and ≈20 percent.
Citations: 23
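A heavily simplified sketch of the idea (the binary-test design, constants, and class structure are invented for illustration; this is not the authors' implementation): scanpaths are reduced to saccade-angle sequences, and each fern hashes short angle successions through a fixed set of binary tests into class-conditional histograms, with no AoIs involved.

```python
import numpy as np

rng = np.random.default_rng(5)

N_FERNS, TESTS, BINS = 8, 4, 8  # 4 binary tests -> 16-way histogram per fern

def fern_indices(angles, tests):
    """One fern: each window of saccade angles is hashed by a fixed set of
    binary tests 'is the angle at offset o in the upper half of the circle?'"""
    bins = (angles / (2 * np.pi) * BINS).astype(int) % BINS
    span = tests.max() + 1
    idx = []
    for s in range(len(bins) - span + 1):
        code = 0
        for t, o in enumerate(tests):
            code |= (bins[s + o] >= BINS // 2) << t
        idx.append(code)
    return np.array(idx)

class FernScanpathClassifier:
    def fit(self, scanpaths, labels, n_classes=2):
        self.tests = [rng.integers(0, 6, TESTS) for _ in range(N_FERNS)]
        self.hist = np.ones((N_FERNS, n_classes, 2 ** TESTS))  # Laplace prior
        for angles, y in zip(scanpaths, labels):
            for f, tst in enumerate(self.tests):
                np.add.at(self.hist[f, y], fern_indices(angles, tst), 1)
        self.hist /= self.hist.sum(axis=2, keepdims=True)
        return self

    def predict(self, angles):
        # Semi-naive Bayes: sum log class-conditional probabilities over ferns.
        scores = np.zeros(self.hist.shape[1])
        for f, tst in enumerate(self.tests):
            scores += np.log(self.hist[f, :, fern_indices(angles, tst)]).sum(axis=0)
        return int(np.argmax(scores))

# Hypothetical demo: class 0 saccades point mostly right, class 1 mostly left.
def make(mu, n=30):
    return [rng.normal(mu, 0.3, 40) % (2 * np.pi) for _ in range(n)]

X = make(0.5) + make(np.pi)
y = [0] * 30 + [1] * 30
clf = FernScanpathClassifier().fit(X, y)
print(clf.predict(rng.normal(np.pi, 0.3, 40) % (2 * np.pi)))  # -> 1
```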
POITrack: improving map-based planning with implicit POI tracking
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3317959.3321491
F. Göbel, P. Kiefer
Abstract: Maps enable complex decision making, such as planning a day trip in a foreign city. This kind of task often requires combining information from different parts of the map, leading to a sequence of visual searches and map extent changes. In the process, the user can easily get lost, unable to find their way back to relevant points of interest (POIs). In this paper, we present POITrack, a novel gaze-adaptive map which supports a user in finding previously inspected POIs faster by providing highlights. Our approach allows filtering inspected POIs based on their category and automatically adapting the current map extent. Not only could participants find visited locations faster with our system, but they also rated the interaction as more pleasing. Our findings can contribute to improving the interaction with high-density visual information, which requires revisiting previously seen objects whose relevance for the task may not have been clear initially.
Citations: 4
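The implicit tracking component can be sketched compactly (hypothetical names and threshold): accumulate gaze dwell per POI, mark a POI as inspected once its dwell crosses a threshold, and expose a category filter for re-finding it.

```python
from collections import defaultdict

DWELL_THRESHOLD = 0.4  # seconds of accumulated gaze to count as inspected

class POITracker:
    def __init__(self, pois):
        self.pois = pois                  # name -> (x, y, radius, category)
        self.dwell = defaultdict(float)

    def on_gaze_sample(self, x, y, dt):
        """Feed one gaze sample; dt is the sample interval in seconds."""
        for name, (px, py, r, _cat) in self.pois.items():
            if (x - px) ** 2 + (y - py) ** 2 <= r ** 2:
                self.dwell[name] += dt

    def inspected(self, category=None):
        """POIs to highlight, optionally filtered by category."""
        return [n for n, d in self.dwell.items()
                if d >= DWELL_THRESHOLD
                and (category is None or self.pois[n][3] == category)]

tracker = POITracker({"cafe": (10, 10, 5, "food"),
                      "museum": (40, 40, 5, "culture")})
for _ in range(50):                  # ~0.5 s of gaze resting on the cafe
    tracker.on_gaze_sample(11, 9, 0.01)
print(tracker.inspected("food"))     # -> ['cafe']
```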
Remote corneal imaging by integrating a 3D face model and an eyeball model
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3314111.3319817
Takamasa Utsu, K. Takemura
Abstract: In corneal imaging methods, it is essential to use a 3D eyeball model for generating an undistorted image. Thus, the relationship between the eye and eye camera is fixed by using a head-mounted device. Remote corneal imaging has several potential applications, such as surveillance systems and driver monitoring. Therefore, we integrated a 3D eyeball model with a 3D face model to facilitate remote corneal imaging. We conducted evaluation experiments and confirmed the feasibility of remote corneal imaging. We showed that the center of the eyeball can be estimated based on face tracking, and thus, corneal imaging can function as continuous remote eye tracking.
Citations: 2
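The core geometric step, locating the eyeball center from face tracking, can be sketched as a rigid transform (illustrative numbers; the paper integrates full 3D face and eyeball models):

```python
import numpy as np

def eyeball_center_world(R_head, t_head, eye_offset):
    """Estimate the eyeball center in camera coordinates by transforming a
    fixed per-person offset (eyeball center relative to the face model's
    origin) with the head pose recovered from face tracking."""
    return R_head @ eye_offset + t_head

# Hypothetical values: head rotated 10 degrees about the y-axis, 60 cm from
# the camera; offset of the right eyeball center in the face-model frame (m).
theta = np.deg2rad(10)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.0, 0.0, 0.6])
offset = np.array([0.032, 0.03, -0.02])
print(eyeball_center_world(R, t, offset))
```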
GazeButton: enhancing buttons with eye gaze interactions
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2019-06-25. DOI: 10.1145/3317956.3318154
S. Rivu, Yasmeen Abdrabou, Thomas Mayer, Ken Pfeuffer, Florian Alt
Abstract: The button is an element of a user interface to trigger an action, traditionally using click or touch. We introduce GazeButton, a novel concept extending the default button mode with advanced gaze-based interactions. During normal interaction, users can utilise this button as a universal hub for gaze-based UI shortcuts. The advantages are: 1) easy to integrate in existing UIs, 2) complementary, as users choose either gaze or manual interaction, 3) straightforward, as all features are located in one button, and 4) one button to interact with the whole screen. We explore GazeButtons for a custom-made text reading, writing, and editing tool on a multitouch tablet device. For example, this allows the text cursor position to be set as users look at the position and tap on the GazeButton, avoiding costly physical movement. Or, users can simply gaze over a part of the text that should be selected, while holding the GazeButton. We present a design space, specific application examples, and point to future button designs that become highly expressive by unifying the user's visual and manual input.
Citations: 24
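The core interaction can be sketched as a small event handler (hypothetical API; the actual system is a full multitouch text editor): a tap that lands on the GazeButton applies the action at the current gaze position rather than the touch position.

```python
from dataclasses import dataclass

@dataclass
class GazeButton:
    x: int
    y: int
    w: int
    h: int
    gaze: tuple = (0, 0)  # updated continuously by the eye tracker

    def hit(self, tx, ty):
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h

    def on_touch_down(self, tx, ty, editor):
        if self.hit(tx, ty):
            # Tap on the button: act at the gaze point, not the finger.
            editor.set_cursor(*self.gaze)

class Editor:
    def set_cursor(self, x, y):
        print(f"cursor moved to gaze position ({x}, {y})")

btn = GazeButton(0, 1000, 200, 80)
btn.gaze = (512, 300)            # user is looking at mid-screen text
btn.on_touch_down(50, 1030, Editor())
```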