Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications: Latest Publications

Predicting observer's task from eye movement patterns during motion image analysis
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204575
Jutta Hild, M. Voit, Christian Kühnle, J. Beyerer
Abstract: Predicting an observer's task from eye movements during various viewing tasks has been investigated by several authors. This contribution adds task prediction from eye movements for tasks occurring during motion image analysis: Explore, Observe, Search, and Track. For this purpose, gaze data was recorded from 30 human observers viewing a motion image sequence once under each task. For task decoding, the classification methods Random Forest, LDA, and QDA were used; features were fixation- or saccade-related measures. The best accuracy for predicting the three tasks Observe, Search, and Track from the 4-minute gaze data samples was 83.7% (chance level 33%), using Random Forest. The best accuracy for predicting all four tasks from the gaze data samples containing the first 30 seconds of viewing was 59.3% (chance level 25%), using LDA. Accuracy decreased significantly for task prediction on small gaze data chunks of 5 and 3 seconds: 45.3% and 38.0% (chance 25%) for the four tasks, and 52.3% and 47.7% (chance 33%) for the three tasks.
Citations: 7
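The decoding pipeline described above (per-sample gaze features fed to a classifier, with Random Forest performing best for three tasks) can be sketched as follows. The feature columns and the synthetic data are illustrative assumptions, not the authors' actual feature set or recordings:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-sample gaze features (the paper uses fixation-
# and saccade-related measures; these columns are illustrative):
# [mean fixation duration (ms), fixation rate (1/s),
#  mean saccade amplitude (deg), saccade rate (1/s)]
n_per_task = 30
tasks = ["Observe", "Search", "Track"]
X = np.vstack([
    rng.normal(loc=[250 + 40 * i, 3.0 - 0.3 * i, 4.0 + i, 2.5 + 0.4 * i],
               scale=[30, 0.3, 0.8, 0.3],
               size=(n_per_task, 4))
    for i in range(len(tasks))
])
y = np.repeat(tasks, n_per_task)

# Random Forest was the best-performing classifier for the three-task case.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With features this cleanly separated the cross-validated accuracy sits well above the 33% chance level; on real gaze data, class overlap is what drives accuracy down for short data chunks.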
Correlation between gaze and hovers during decision-making interaction
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204567
Pierre Weill-Tessier, Hans-Werner Gellersen
Abstract: Taps make up only a small part of the manual input when interacting with touch-enabled surfaces. Indeed, how the hand behaves in the hovering space is informative of what the user intends to do. In this article, we present a data collection related to hand and eye motion. We tailored a kiosk-like system to record participants' gaze and hand movements, and specifically designed a memory game to capture the decision-making process users may face. Our data collection comprises 177 trials from 71 participants. Based on a hand movement classification, we extracted 16,588 hovers. We study gaze behaviour during hovers, and found that the distance between gaze and hand depends on the target's location on the screen. We also show how indecision can be deduced from this distance.
Citations: 5
Virtual reality as a proxy for real-life social attention?
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3207411
Marius Rubo, M. Gamer
Abstract: Previous studies found large amounts of overt attention allocated towards human faces when they were presented as images or videos, but a relative avoidance of gaze at conspecifics' faces in real-world situations. We measured gaze behavior in a complex virtual scenario in which a human face and an object were similarly exposed to the participants' view. Gaze at the face was avoided compared to gaze at the object, supporting the hypothesis that virtual reality scenarios can elicit modes of information processing comparable to real-world situations.
Citations: 10
Pupil size as an indicator of visual-motor workload and expertise in microsurgical training tasks
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204577
R. Bednarik, P. Bartczak, Hana Vrzakova, Jani Koskinen, A. Elomaa, Antti Huotarinen, David Gil de Gómez Pérez, M. Fraunberg
Abstract: Pupillary responses have long been linked to cognitive workload in numerous tasks. In this work, we investigate the role of pupil dilations in the context of microsurgical training, in particular the handling of microinstruments and the act of suturing. With an eye tracker embedded in the surgical microscope oculars, eleven medical participants each performed 12 sutures of artificial skin under high magnification. A detailed analysis of pupillary dilations in suture segments revealed that pupillary responses varied not only according to the main suture segments but also in relation to participants' expertise.
Citations: 18
Wearable eye tracker calibration at your fingertips
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204592
Mihai Bâce, S. Staal, Gábor Sörös
Abstract: Common calibration techniques for head-mounted eye trackers rely on markers or an additional person to assist with the procedure. This is a tedious process and may even hinder some practical applications. We propose a novel calibration technique that simplifies the initial calibration step for mobile scenarios. To collect the calibration samples, users only have to point with a finger at various locations in the scene. Our vision-based algorithm detects the user's hand and fingertips, which indicate the user's point of interest, eliminating the need for additional assistance or specialized markers. Our approach achieves accuracy comparable to similar marker-based calibration techniques and was the method preferred by users in our study. The implementation is openly available as a plugin for the open-source Pupil eye tracking platform.
Citations: 8
Dwell time reduction technique using Fitts' law for gaze-based target acquisition
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204532
Toshiya Isomoto, Toshiyuki Ando, B. Shizuki, Shin Takahashi
Abstract: We present a dwell time reduction technique for gaze-based target acquisition that adopts Fitts' law. Our technique uses both the eye movement time for target acquisition estimated with Fitts' law (Te) and the actual eye movement time (Ta); a target is acquired when the difference between Te and Ta is small. First, we investigated the relation between eye movements for target acquisition and Fitts' law; the results indicated a correlation of 0.90 after error correction. Then we designed and implemented our technique. Finally, we conducted a user study to investigate its performance; an average dwell time of 86.7 ms was achieved, with a 10.0% Midas-touch rate.
Citations: 20
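The core idea above, comparing a Fitts'-law prediction Te against the measured movement time Ta and acquiring the target when they agree, can be sketched in a few lines. The coefficients a and b and the tolerance are illustrative placeholders; the paper fits them empirically:

```python
import math

def fitts_movement_time(distance, width, a=0.0, b=0.1):
    """Predicted eye movement time Te (s) via Fitts' law:
    Te = a + b * log2(D/W + 1). Coefficients a, b are illustrative."""
    return a + b * math.log2(distance / width + 1)

def should_acquire(actual_time, distance, width, tolerance=0.05):
    """Acquire the target when predicted (Te) and actual (Ta) movement
    times agree closely, instead of waiting out a fixed dwell threshold.
    The tolerance value is an assumption for this sketch."""
    predicted = fitts_movement_time(distance, width)
    return abs(predicted - actual_time) <= tolerance

# Example: a saccade over 300 px to a 50-px-wide target.
te = fitts_movement_time(300, 50)
print(f"predicted movement time: {te:.3f} s")
print(should_acquire(actual_time=te + 0.01, distance=300, width=50))
```

Because acquisition fires as soon as Ta matches Te, the effective dwell can drop well below a conventional fixed threshold, which is how the reported 86.7 ms average becomes plausible.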
Real-time gaze transition entropy
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3208340
Islam Akef Ebeid, J. Gwizdka
Abstract: In this video, we introduce a real-time algorithm that computes gaze transition entropy. This approach can be employed to detect higher-level cognitive states such as situation awareness. We first compute fixations using our real-time version of a well-established velocity-threshold algorithm. We then compute the gaze transition entropy for a content-independent grid of areas of interest in real time using an updating processing-window approach, testing after each update whether the Markov assumption holds. Higher entropy corresponds to increased eye movement and more frequent monitoring of the visual field; lower entropy corresponds to fewer eye movements and less frequent monitoring. Based on entropy levels, the system could alert the user accordingly and plausibly offer an intervention. We developed an example application to demonstrate the use of the online calculation of gaze transition entropy in a practical scenario.
Citations: 10
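Gaze transition entropy, as computed above over an AOI grid, is the conditional entropy of the fixation-to-fixation transition matrix, H = -Σᵢ πᵢ Σⱼ pᵢⱼ log₂ pᵢⱼ. A minimal offline sketch (the paper's contribution is doing this incrementally in real time, which this batch version does not attempt):

```python
import numpy as np

def gaze_transition_entropy(aoi_sequence, n_aois):
    """Conditional entropy (bits) of AOI-to-AOI fixation transitions:
    H = -sum_i pi_i * sum_j p_ij * log2(p_ij), where pi is the empirical
    source-AOI distribution and p_ij the row-normalized transition matrix.
    `aoi_sequence` is one AOI index per fixation."""
    counts = np.zeros((n_aois, n_aois))
    for src, dst in zip(aoi_sequence, aoi_sequence[1:]):
        counts[src, dst] += 1
    row_totals = counts.sum(axis=1)
    pi = row_totals / row_totals.sum()          # empirical source distribution
    entropy = 0.0
    for i in range(n_aois):
        if row_totals[i] == 0:
            continue                            # AOI never a transition source
        p = counts[i] / row_totals[i]           # transition probs from AOI i
        p = p[p > 0]
        entropy -= pi[i] * np.sum(p * np.log2(p))
    return entropy

# Erratic switching across a 4-AOI grid yields higher entropy than
# dwelling in a single region.
scanning = [0, 1, 0, 2, 3, 1, 2, 0, 3, 2, 1, 3]
dwelling = [0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
print(gaze_transition_entropy(scanning, 4))  # higher
print(gaze_transition_entropy(dwelling, 4))  # lower
```

A real-time variant would maintain `counts` over a sliding window of recent fixations and recompute the entropy on each update rather than over the whole sequence.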
Automatic mapping of gaze position coordinates of eye-tracking glasses video on a common static reference image
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3208331
Adam Bykowski, Szymon Kupiński
Abstract: This paper describes a method for automatic semantic gaze mapping from video obtained by eye-tracking glasses to a common reference image. Image feature detection and description algorithms are used to find the position of subsequent video frames and map the corresponding gaze coordinates onto a common reference image. This process allows experiment results to be aggregated for further analysis and provides an alternative to manual semantic gaze mapping.
Citations: 5
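Once feature matching has registered a video frame against the reference image, the gaze mapping itself reduces to applying a 3×3 homography to each gaze point. A minimal sketch of that final step, assuming the homography is already estimated (in practice per frame, e.g. from ORB matches with RANSAC); the matrix values here are illustrative:

```python
import numpy as np

def map_gaze_to_reference(gaze_xy, H):
    """Map a gaze point from a scene-video frame onto the reference image
    using homography H. H would normally be re-estimated for every frame
    from feature correspondences; here it is simply given."""
    x, y = gaze_xy
    p = H @ np.array([x, y, 1.0])   # homogeneous coordinates
    return p[:2] / p[2]             # perspective divide

# Illustrative homography: the frame appears in the reference image
# scaled by 2 and shifted by (10, 20).
H = np.array([[2.0, 0.0, 10.0],
              [0.0, 2.0, 20.0],
              [0.0, 0.0,  1.0]])
print(map_gaze_to_reference((100.0, 50.0), H))  # → [210. 120.]
```

Aggregating the mapped points from all frames (and all participants) on the one reference image is what enables the pooled analysis the abstract describes.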
Development and evaluation of a gaze feedback system integrated into eyetrace
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204561
Kai Otto, Nora Castner, David Geisler, Enkelejda Kasneci
Abstract: A growing area of eye-tracking research is the use of gaze data for real-time feedback to the subject. In this work, we present a software system for such experiments and validate it with a visual search task, integrating the system into an eye-tracking analysis tool. Our aim was to improve subject performance in this task by employing saliency features for gaze guidance. This real-time feedback system is applicable in many domains, such as learning interventions, computer entertainment, and virtual reality.
Citations: 7
Anyorbit: orbital navigation in virtual environments with eye-tracking
Pub Date: 2018-06-14 · DOI: 10.1145/3204493.3204555
B. Outram, Yun Suen Pai, Tanner Person, K. Minamizawa, K. Kunze
Abstract: Gaze-based interactions promise to be fast, intuitive, and effective in controlling virtual and augmented environments, yet there is still a lack of usable 3D navigation and observation techniques. In this work: 1) we introduce a highly advantageous orbital navigation technique, AnyOrbit, providing an intuitive and hands-free method of observation in virtual environments that uses eye-tracking to control the orbital center of movement; 2) we demonstrate the versatility of the technique with several control schemes and use cases in virtual/augmented-reality head-mounted-display and desktop setups, including observation of 3D astronomical data and spectator sports.
Citations: 14