Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications: Latest Publications

Binocular model-based gaze estimation with a camera and a single infrared light source
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204557
Laura Sesma, D. W. Hansen
Abstract: We propose a binocular model-based gaze estimation method that uses only a single camera and a single infrared light source. Most gaze estimation approaches are based on single-eye models; binocular setups are typically handled by averaging the results from each eye. In this work, we propose a geometric model of both eyes for gaze estimation. The proposed model is implemented and evaluated in a simulated environment and compared to a binocular model-based method and a polynomial regression-based method that use one camera and two infrared lights and average the results from both eyes. The method performs on par with methods using multiple light sources while maintaining robustness to head movements. The study shows that using both eyes in gaze estimation models makes it possible to reduce the hardware requirements while maintaining robustness.
Citations: 1
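For context on the regression baseline this entry compares against, below is a minimal sketch of polynomial-regression gaze estimation that averages per-eye estimates. The feature choice (pupil-glint vector) and the second-order polynomial are common conventions assumed for illustration, not details taken from the paper.

```python
# Minimal sketch of a polynomial-regression gaze baseline that averages
# the estimates of both eyes. Feature choice (pupil-glint vector) and the
# second-order polynomial are illustrative assumptions, not the paper's model.
import numpy as np

def design_matrix(v):
    """Second-order polynomial terms of a pupil-glint vector v = (x, y)."""
    x, y = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_eye(pupil_glint, targets):
    """Least-squares fit from per-eye features to on-screen calibration targets."""
    A = design_matrix(pupil_glint)
    coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs  # shape (6, 2): maps features to (x, y) screen coordinates

def predict(coeffs, pupil_glint):
    return design_matrix(pupil_glint) @ coeffs

# Synthetic calibration data: per-eye pupil-glint vectors plus known targets.
rng = np.random.default_rng(0)
targets = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
left_feat = targets * 0.05 + rng.normal(0, 1e-3, targets.shape)
right_feat = targets * 0.05 + 0.01 + rng.normal(0, 1e-3, targets.shape)

left_c, right_c = fit_eye(left_feat, targets), fit_eye(right_feat, targets)

# Binocular estimate: average the two monocular predictions.
gaze = 0.5 * (predict(left_c, left_feat) + predict(right_c, right_feat))
print(np.abs(gaze - targets).mean())  # mean error on the calibration points
```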
CBF
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204559
Wolfgang Fuhl, D. Geisler, Thiago Santini, Tobias Appel, W. Rosenstiel, Enkelejda Kasneci
Abstract: Modern eye tracking systems rely on fast and robust pupil detection, and several algorithms have been proposed for eye tracking under real-world conditions. In this work, we propose a novel binary feature selection approach that is trained by computing conditional distributions. These features are scalable and rotatable, allowing for distinct image resolutions, and consist of simple intensity comparisons, making the approach robust to different illumination conditions as well as rapid illumination changes. The proposed method was evaluated on multiple publicly available data sets, considerably outperforming state-of-the-art methods while remaining real-time capable at very high frame rates. Moreover, our method is designed to sustain pupil center estimation even when typical edge-detection-based approaches fail, e.g., when the pupil outline is not visible due to occlusions from reflections or eyelids and lashes. As a consequence, it does not attempt to provide an estimate of the pupil outline. Nevertheless, the pupil center suffices for gaze estimation, e.g., by regressing the relationship between pupil center and gaze point during calibration.
Citations: 8
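To illustrate the kind of intensity-comparison binary features the abstract above describes, here is a minimal sketch; the circular sampling pattern, offsets, and scale/rotation handling are illustrative assumptions rather than the paper's actual feature design.

```python
# Minimal sketch of binary features built from pairwise intensity comparisons
# around a candidate pupil location. The circular sampling pattern and the way
# scale/rotation are handled here are assumptions, not the CBF design itself.
import numpy as np

def sample_offsets(n_points=16, radius=10.0, angle=0.0):
    """Points on a circle around the candidate; radius scales, angle rotates."""
    t = np.linspace(0, 2 * np.pi, n_points, endpoint=False) + angle
    return np.stack([radius * np.cos(t), radius * np.sin(t)], axis=1)

def binary_feature(image, center, n_points=16, radius=10.0, angle=0.0):
    """Bit vector of intensity comparisons between consecutive sample points."""
    h, w = image.shape
    pts = np.rint(center + sample_offsets(n_points, radius, angle)).astype(int)
    pts[:, 0] = np.clip(pts[:, 0], 0, w - 1)   # x
    pts[:, 1] = np.clip(pts[:, 1], 0, h - 1)   # y
    vals = image[pts[:, 1], pts[:, 0]].astype(float)
    # Each bit: is this sample darker than the next one along the circle?
    return (vals < np.roll(vals, -1)).astype(np.uint8)

# Example on a synthetic eye image with a dark "pupil" blob.
img = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
img[(xx - 80) ** 2 + (yy - 60) ** 2 < 15 ** 2] = 30
print(binary_feature(img, center=np.array([80.0, 60.0]), radius=20.0))
```

Because each bit is a relative comparison between two pixels, a global brightness shift leaves the feature vector unchanged, which is the property the abstract credits for robustness to illumination changes.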
Use of attentive information dashboards to support task resumption in working environments
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208348
Peyman Toreini, Moritz Langner, A. Maedche
Abstract: Interruptions are one of the big challenges in working environments. When the primary task is not resumed properly, such interruptions can result in task resumption failures and negatively influence task performance. This phenomenon also occurs when users work with information dashboards in working environments. To address this problem, an attentive dashboard that issues visual feedback is developed. This feedback supports the user in resuming the primary task after an interruption by guiding visual attention. The attentive dashboard captures the user's visual attention allocation with a low-cost screen-based eye tracker while they are monitoring the graphs. The dashboard is sensitive to the occurrence of external interruptions by tracking the eye-movement data in real time. Moreover, based on the collected eye-movement data, two types of visual feedback are designed, which highlight the last fixated graph and the unnoticed ones.
Citations: 6
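The core bookkeeping the abstract above describes, mapping fixations to dashboard graphs and remembering the last fixated and still-unnoticed ones at the moment of an interruption, can be sketched as follows; the AOI layout and data structures are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the fixation bookkeeping behind such a dashboard:
# map fixations to graph AOIs, and on interruption report the last fixated
# graph and the graphs not yet noticed. The AOI layout is an assumed example.
from dataclasses import dataclass

@dataclass
class AOI:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class AttentionTracker:
    def __init__(self, aois):
        self.aois = aois
        self.visited = set()
        self.last_fixated = None

    def on_fixation(self, gx, gy):
        for aoi in self.aois:
            if aoi.contains(gx, gy):
                self.visited.add(aoi.name)
                self.last_fixated = aoi.name

    def on_interruption(self):
        unnoticed = [a.name for a in self.aois if a.name not in self.visited]
        return self.last_fixated, unnoticed

# Usage: four graphs on a 1920x1080 dashboard (assumed layout).
tracker = AttentionTracker([
    AOI("sales", 0, 0, 960, 540), AOI("traffic", 960, 0, 960, 540),
    AOI("errors", 0, 540, 960, 540), AOI("latency", 960, 540, 960, 540),
])
for fx, fy in [(200, 300), (1200, 250)]:   # fixation centroids from the eye tracker
    tracker.on_fixation(fx, fy)
print(tracker.on_interruption())  # ('traffic', ['errors', 'latency'])
```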
Intelligent cockpit: eye tracking integration to enhance the pilot-aircraft interaction
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3207420
C. Lounis, Vsevolod Peysakhovich, M. Causse
Abstract: In this research, we use eye tracking to monitor the attentional behavior of pilots in the cockpit. Based on numerous flight simulator sessions with eye-tracking recordings, we built a cockpit monitoring database that serves as a reference for real-time assessment of the pilot's monitoring strategies. Eye tracking may also be employed as a passive input for assistive systems; future studies will explore the possibility of adapting the notifications' modality using gaze.
Citations: 10
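One simple way to realize the kind of real-time comparison against a reference monitoring database that the abstract above mentions is to compare per-instrument dwell-time proportions; the instrument names, reference values, distance measure, and threshold below are illustrative assumptions, not details from the study.

```python
# Minimal sketch: compare a pilot's per-instrument dwell-time distribution
# against reference proportions from a monitoring database. Instrument names,
# reference values, and the alerting threshold are assumptions for illustration.
import numpy as np

REFERENCE = {  # assumed reference proportions of dwell time per instrument
    "attitude_indicator": 0.35, "airspeed": 0.20,
    "altimeter": 0.20, "navigation_display": 0.25,
}

def dwell_proportions(fixations):
    """fixations: list of (instrument_name, dwell_duration_s)."""
    totals = {k: 0.0 for k in REFERENCE}
    for name, dur in fixations:
        if name in totals:
            totals[name] += dur
    total = sum(totals.values()) or 1.0
    return {k: v / total for k, v in totals.items()}

def monitoring_deviation(fixations):
    """Total variation distance between observed and reference proportions."""
    obs = dwell_proportions(fixations)
    return 0.5 * sum(abs(obs[k] - REFERENCE[k]) for k in REFERENCE)

session = [("attitude_indicator", 12.0), ("airspeed", 1.5),
           ("navigation_display", 9.0), ("altimeter", 2.5)]
dev = monitoring_deviation(session)
print(f"deviation from reference scan pattern: {dev:.2f}")
if dev > 0.25:  # assumed alerting threshold
    print("monitoring strategy deviates; consider an attentional prompt")
```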
Image-based scanpath comparison with slit-scan visualization
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204581
Maurice Koch, K. Kurzhals, D. Weiskopf
Abstract: The comparison of scanpaths between multiple participants is an important analysis task in eye tracking research. Established methods typically inspect recorded gaze sequences based on geometrical trajectory properties or strings derived from annotated areas of interest (AOIs). We propose a new approach based on image similarities of gaze-guided slit-scans: for each time step, a vertical slice is extracted from the stimulus at the gaze position. Placing the slices next to each other over time creates a compact representation of a scanpath in the context of the stimulus. These visual representations can be compared based on their image similarity, providing a new measure for scanpath comparison without the need for annotation. We demonstrate how comparative slit-scan visualization can be integrated into a visual analytics approach to support the interpretation of scanpath similarities in general.
Citations: 4
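The slit-scan construction itself is straightforward to sketch: for each gaze sample, cut a narrow vertical slice from the stimulus at the gaze x-position and concatenate the slices over time. The slice width and the use of plain correlation as the similarity measure below are illustrative assumptions, not the paper's exact choices.

```python
# Minimal sketch of building a gaze-guided slit-scan and comparing two of them.
# Slice width and the correlation-based similarity are assumed for illustration.
import numpy as np

def slit_scan(stimulus, gaze_x, slice_width=5):
    """stimulus: (H, W) grayscale frame; gaze_x: x coordinate per time step."""
    h, w = stimulus.shape
    half = slice_width // 2
    slices = []
    for x in gaze_x:
        x = int(np.clip(x, half, w - half - 1))
        slices.append(stimulus[:, x - half:x + half + 1])
    return np.hstack(slices)            # (H, T * slice_width) image

def scanpath_similarity(scan_a, scan_b):
    """Pearson correlation of two equally sized slit-scan images."""
    a = scan_a.astype(float).ravel()
    b = scan_b.astype(float).ravel()
    return np.corrcoef(a, b)[0, 1]

# Two participants viewing the same static stimulus (synthetic example).
stimulus = np.tile(np.linspace(0, 255, 320), (240, 1))
gaze_p1 = np.linspace(20, 300, 50)                 # left-to-right sweep
gaze_p2 = gaze_p1 + np.random.default_rng(1).normal(0, 5, 50)

scan1 = slit_scan(stimulus, gaze_p1)
scan2 = slit_scan(stimulus, gaze_p2)
print(f"slit-scan similarity: {scanpath_similarity(scan1, scan2):.3f}")
```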
Head and gaze control of a telepresence robot with an HMD
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208330
J. P. Hansen, A. Alapetite, Martin Thomsen, Zhongyu Wang, Katsumi Minakata, Guangtao Zhang
Abstract: Gaze interaction with telerobots is a new opportunity for wheelchair users with severe motor disabilities. We present a video showing how head-mounted displays (HMDs) with gaze tracking can be used to monitor a robot that carries a 360° video camera and a microphone. Our interface supports autonomous driving via waypoints on a map, along with gaze-controlled steering and gaze typing. It is implemented with Unity, which communicates with the Robot Operating System (ROS).
Citations: 24
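As a rough illustration of gaze-controlled steering in such a setup, the sketch below maps a normalized gaze point in the HMD view to forward and turning velocities; the dead zone, gains, and the idea of forwarding the result as a geometry_msgs/Twist on /cmd_vel are assumptions about how one might wire this to ROS, not details of the authors' Unity/ROS implementation.

```python
# Minimal sketch of mapping a normalized gaze point (0..1 in HMD view space)
# to a differential-drive steering command. Gains, dead zone, and the idea of
# publishing the result as a geometry_msgs/Twist on /cmd_vel are assumptions,
# not the interface of the authors' system.
from dataclasses import dataclass

@dataclass
class SteeringCommand:
    linear: float    # forward velocity in m/s
    angular: float   # turn rate in rad/s

def gaze_to_steering(gx: float, gy: float,
                     max_linear=0.5, max_angular=1.0, dead_zone=0.1):
    """Center of view = stop; look up to drive forward, left/right to turn."""
    dx = gx - 0.5          # horizontal offset from view center
    dy = 0.5 - gy          # positive when looking above center
    if abs(dx) < dead_zone:
        dx = 0.0
    if abs(dy) < dead_zone:
        dy = 0.0
    return SteeringCommand(linear=2 * dy * max_linear,
                           angular=-2 * dx * max_angular)

# In a ROS node, the command would be copied into a geometry_msgs/Twist
# (cmd.linear -> twist.linear.x, cmd.angular -> twist.angular.z) and
# published on /cmd_vel at a fixed rate.
print(gaze_to_steering(0.7, 0.2))   # look up-right: drive forward, turn right
```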
Comparison of mapping algorithms for implicit calibration using probable fixation targets
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204529
P. Kasprowski, Katarzyna Harężlak
Abstract: With growing access to cheap, low-end eye trackers using simple web cameras, there is also a growing demand for easy and fast usage of these devices by untrained and unsupervised end users. For such users, the necessity to calibrate the eye tracker prior to its first usage is often perceived as obtrusive and inconvenient. At the same time, perfect accuracy is not necessary for many commercial applications. Therefore, the idea of implicit calibration attracts more and more attention. Algorithms for implicit calibration are able to calibrate the device without any active collaboration with users. In particular, real-time implicit calibration, which can calibrate a device on the fly while a person uses the eye tracker, seems to be a reasonable solution to the aforementioned problems. The paper presents examples of implicit calibration algorithms (including their real-time versions) based on the idea of probable fixation targets (PFT). The algorithms were tested during a free-viewing experiment and compared to the state-of-the-art PFT-based algorithm and to explicit calibration results.
Citations: 5
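A minimal sketch of the general PFT idea, assigning each uncalibrated gaze sample to its most probable on-screen target and then regressing a mapping from raw gaze to those targets, is given below; the alternating assign-and-refit loop and the affine mapping are illustrative assumptions, not one of the paper's specific algorithms.

```python
# Minimal sketch of implicit calibration with probable fixation targets (PFT):
# assign each raw gaze sample to the nearest probable target, fit an affine
# mapping, and iterate. The affine model and the alternating loop are
# illustrative assumptions, not a specific algorithm from the paper.
import numpy as np

def fit_affine(raw, mapped_targets):
    """Least-squares affine mapping from raw gaze (N, 2) to targets (N, 2)."""
    A = np.hstack([raw, np.ones((len(raw), 1))])
    M, *_ = np.linalg.lstsq(A, mapped_targets, rcond=None)
    return M                                      # shape (3, 2)

def apply_affine(M, raw):
    return np.hstack([raw, np.ones((len(raw), 1))]) @ M

def implicit_calibration(raw_gaze, targets, n_iter=10):
    M = np.vstack([np.eye(2), np.zeros((1, 2))])  # start with identity mapping
    for _ in range(n_iter):
        mapped = apply_affine(M, raw_gaze)
        # assign each sample to its nearest probable fixation target
        d = np.linalg.norm(mapped[:, None, :] - targets[None, :, :], axis=2)
        assigned = targets[np.argmin(d, axis=1)]
        M = fit_affine(raw_gaze, assigned)
    return M

# Synthetic example: true gaze lands on known probable targets, but the raw
# (uncalibrated) signal is scaled and shifted.
rng = np.random.default_rng(0)
targets = np.array([[x, y] for x in (100, 500, 900) for y in (100, 400, 700)], float)
true = targets[rng.integers(0, len(targets), 200)]
raw = true * 0.9 + np.array([20.0, -10.0]) + rng.normal(0, 5, (200, 2))

M = implicit_calibration(raw, targets)
err = np.linalg.norm(apply_affine(M, raw) - true, axis=1).mean()
print(f"mean error after implicit calibration: {err:.1f} px")
```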
Rapid alternating saccade training
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204588
Brent D. Parsons, R. Ivry
Abstract: While individual eye movement characteristics are remarkably stable, experiments on saccadic spatial adaptation indicate that oculomotor learning is possible. To further investigate saccadic learning, participants received veridical feedback on saccade rate while making sequential saccades as quickly as possible between two horizontal targets. Over the course of five days, with just ten minutes of training per day, participants were able to significantly increase their rate of sequential saccades. This occurred through both a reduction in dwell duration and changes in secondary saccade characteristics. There was no concomitant change in participants' accuracy or precision. The learning was retained after training and generalized to saccades of different directions and to reaction time measures in a delayed saccade task. The study provides evidence for a novel form of saccadic learning with applicability in a number of domains.
Citations: 2
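The two quantities the study tracks, alternation (saccade) rate and dwell duration between the two targets, can be computed from a gaze trace roughly as follows; the velocity threshold used to segment saccades is a common heuristic, not the study's analysis pipeline.

```python
# Minimal sketch: estimate saccade rate and mean dwell duration from a 1-D
# horizontal gaze trace sampled at a fixed rate. The velocity threshold used
# to segment saccades is a common heuristic, not the study's exact pipeline.
import numpy as np

def saccade_metrics(gaze_x_deg, fs=1000.0, vel_thresh=30.0):
    """gaze_x_deg: horizontal gaze in degrees; fs: sampling rate in Hz."""
    velocity = np.abs(np.gradient(gaze_x_deg)) * fs            # deg/s
    in_saccade = velocity > vel_thresh
    onsets = np.flatnonzero(in_saccade[1:] & ~in_saccade[:-1]) + 1
    offsets = np.flatnonzero(~in_saccade[1:] & in_saccade[:-1]) + 1
    rate = len(onsets) / (len(gaze_x_deg) / fs)                # saccades per second
    # dwell = time from the end of the previous saccade to the next onset
    dwells = []
    for on in onsets:
        prev = offsets[offsets < on]
        if prev.size:
            dwells.append((on - prev[-1]) / fs)
    return rate, float(np.mean(dwells)) if dwells else float("nan")

# Synthetic trace: alternation between targets at -10 and +10 degrees, 2 Hz.
fs = 1000.0
t = np.arange(0, 5, 1 / fs)
trace = np.where(np.sin(2 * np.pi * 2 * t) >= 0, 10.0, -10.0)

rate, dwell = saccade_metrics(trace, fs)
print(f"saccade rate: {rate:.1f}/s, mean dwell: {dwell * 1000:.0f} ms")
```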
Contour-guided gaze gestures: using object contours as visual guidance for triggering interactions
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204530
Florian Jungwirth, Michael Haslgrübler, A. Ferscha
Abstract: The eyes are an interesting modality for pervasive interactions, though their applicability in mobile scenarios has so far been restricted by several issues. In this paper, we propose the idea of contour-guided gaze gestures, which overcome former constraints, like the need for calibration, by relying on unnatural and relative eye movements, as users trace the contours of objects in order to trigger an interaction. The interaction concept and the system design are described, along with two user studies that demonstrate the method's applicability. It is shown that users were able to trace object contours to trigger actions from various positions on multiple different objects. It is further determined that the proposed method is an easy-to-learn, hands-free interaction technique that is robust against false positive activations. Results highlight low demand values and show that the method holds potential for further exploration, but also reveal areas for refinement.
Citations: 15
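One way to detect such a gesture, suggested by the reliance on relative rather than absolutely calibrated gaze, is to compare the shape of the recent gaze trace with the object's contour after removing translation and scale; the resampling, normalization, and matching threshold below are illustrative assumptions, not the authors' detection pipeline.

```python
# Minimal sketch of matching a relative gaze trace against an object contour:
# resample both paths, normalize away translation and scale, and trigger when
# the mean shape distance falls below a threshold. The normalization and the
# threshold are illustrative assumptions, not the authors' detection method.
import numpy as np

def resample(path, n=64):
    """Resample a 2-D polyline to n points, evenly spaced by arc length."""
    path = np.asarray(path, float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    u = np.linspace(0, s[-1], n)
    return np.column_stack([np.interp(u, s, path[:, 0]),
                            np.interp(u, s, path[:, 1])])

def normalize(path):
    """Remove translation and scale so only the shape remains."""
    p = path - path.mean(axis=0)
    scale = np.linalg.norm(p, axis=1).mean() or 1.0
    return p / scale

def gesture_score(gaze_trace, contour, n=64):
    g, c = normalize(resample(gaze_trace, n)), normalize(resample(contour, n))
    return np.linalg.norm(g - c, axis=1).mean()

# Example: a rectangular object contour and a noisy, uncalibrated gaze trace
# that traces the same shape at a different position and scale.
contour = np.array([[0, 0], [4, 0], [4, 2], [0, 2], [0, 0]], float)
rng = np.random.default_rng(2)
gaze = resample(contour, 200) * 25 + 300 + rng.normal(0, 3, (200, 2))

score = gesture_score(gaze, contour)
print(f"shape distance: {score:.2f}")
if score < 0.3:   # assumed trigger threshold
    print("contour gesture recognized: trigger interaction")
```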
Investigating the multicausality of processing speed deficits across developmental disorders with eye tracking and EEG: extended abstract
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3207417
S. Dziemian, N. Langer
Abstract: Neuropsychological tests inform about performance differences in cognitive functions, but they typically tell little about the causes of these differences. Here, we propose a project that builds upon a recently developed multimodal neuroscientific approach of simultaneous eye-tracking and EEG measurements to provide insights into diverse causes of performance differences in the Symbol Search Test (SST). Using a unique large dataset, we plan to investigate the causes of performance differences in the SST in healthy and clinically diagnosed children and adolescents. Firstly, we aim to investigate how the causes of differences in SST performance evolve with age in healthy, typically developing children. With this, we plan to dissect aging effects from effects that are specific to developmental neuropsychiatric disorders. Secondly, we will include subjects with deficient performance to investigate different causes of poor performance and to identify data-driven subgroups of poor performers.
Citations: 1