ACM Symposium on Eye Tracking Research and Applications: Latest Publications

Are They Actually Looking? Identifying Smartphones Shoulder Surfing Through Gaze Estimation
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391422
Alia Saad, Dina Hisham Elkafrawy, Slim Abdennadher, Stefan Schneegass
Abstract: Mobile devices have evolved to be a crucial part of our everyday lives. However, they are subject to different types of user-centered attacks, such as shoulder surfing. Previous work focused on notifying the user of a potential shoulder surfer whenever an extra face is detected. Although successful, this approach ignores the possibility that the alleged attacker is merely standing nearby and not looking at the user's device. In this work, we investigate estimating the gaze of potential attackers in order to verify whether they are indeed looking at the user's phone.
Citations: 7
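The verification step this paper argues for reduces, in its simplest form, to a geometric test: given an estimated 3D gaze direction for the bystander and the phone's position relative to the bystander's eyes, check whether the angle between the gaze ray and the eye-to-phone vector is small. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function name, coordinate convention, and 10° threshold are all assumptions.

```python
import math

def is_looking_at(gaze_dir, eye_pos, phone_pos, max_angle_deg=10.0):
    """Return True if the gaze ray from eye_pos along gaze_dir points at
    phone_pos within max_angle_deg degrees (hypothetical threshold)."""
    to_phone = [p - e for p, e in zip(phone_pos, eye_pos)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_phone))
    norm_g = math.sqrt(sum(g * g for g in gaze_dir))
    norm_t = math.sqrt(sum(t * t for t in to_phone))
    # Clamp before acos to guard against floating-point overshoot.
    cos_angle = max(-1.0, min(1.0, dot / (norm_g * norm_t)))
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg

# Bystander 1 m behind the phone, gazing straight at it:
print(is_looking_at([0.0, 0.0, -1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]))  # True
# Same bystander gazing 45 degrees off to the side:
print(is_looking_at([1.0, 0.0, -1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]))  # False
```

A real system would additionally have to estimate the bystander's eye position and gaze direction from the front camera, which is the hard part the paper addresses.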
Cognitive Load during Eye-typing
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379155.3391333
Tanya Bafna, J. P. Hansen, Per Baekgaard
Abstract: In this paper, we measured cognitive load during an interactive eye-tracking task. Eye-typing was chosen as the task because of its familiarity, ubiquity, and ease. In experiments with 18 participants, who memorized and eye-typed easy and difficult sentences over four days, we compared the difficulty levels of the tasks using subjective scores and eye metrics such as blink duration, frequency, and interval, as well as pupil dilation, in addition to performance measures such as typing speed, error rate, and attended-but-not-selected rate. Typing performance decreased with increased task difficulty, while blink frequency, duration, and interval were higher for the difficult tasks. Pupil dilation reflected the memorization process but did not show a difference between easy and difficult tasks.
Citations: 8
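The blink metrics compared in this study (frequency, duration, and inter-blink interval) can all be derived from a list of blink onset/offset timestamps. A minimal sketch, with the function name and sample data assumed purely for illustration:

```python
def blink_metrics(blinks, session_seconds):
    """Compute blink frequency (blinks/min), mean blink duration (s), and
    mean inter-blink interval (s) from (onset, offset) timestamp pairs."""
    frequency = len(blinks) / session_seconds * 60.0
    mean_duration = sum(off - on for on, off in blinks) / len(blinks)
    # Interval = gap between one blink's offset and the next blink's onset.
    gaps = [blinks[i + 1][0] - blinks[i][1] for i in range(len(blinks) - 1)]
    mean_interval = sum(gaps) / len(gaps) if gaps else 0.0
    return frequency, mean_duration, mean_interval

# Three blinks in a 60 s trial (onset, offset in seconds):
blinks = [(5.0, 5.2), (25.0, 25.3), (45.0, 45.1)]
freq, dur, interval = blink_metrics(blinks, 60.0)
print(freq)                # 3.0 blinks per minute
print(round(dur, 2))       # 0.2
print(round(interval, 2))  # 19.75
```

Detecting the blinks themselves (e.g. from eyelid closure or lost pupil samples) is a separate problem that depends on the tracker.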
Sequence Models in Eye Tracking: Predicting Pupil Diameter During Learning
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391653
Sharath C. Koorathota, Kaveri A. Thakoor, Patrick Adelman, Yaoli Mao, Xueqing Liu, P. Sajda
Abstract: We describe a deep learning framework for predicting pupil diameter from eye-tracking data. Using a variety of inputs, such as fixation positions, durations, saccades, and blink-related information, we assessed the performance of a sequence model in predicting future pupil diameter in a student population as they watched educational videos in a controlled setting. By assessing student performance on a post-viewing test, we find that deep learning sequence models may be useful for separating components of pupil responses linked to luminance and accommodation from those linked to cognition and arousal.
Citations: 4
Neural networks for optical vector and eye ball parameter estimation
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379156.3391346
Wolfgang Fuhl, Hong Gao, Enkelejda Kasneci
Abstract: In this work, we evaluate neural networks, support vector machines, and decision trees for regressing the center of the eyeball and the optical vector from the pupil ellipse. In the evaluation, we analyze both single ellipses and window-based approaches as input, comparing accuracy and runtime. The evaluation gives an overview of the accuracy to expect with different models and numbers of input ellipses. A simulator was implemented to generate the training and evaluation data. For a visual evaluation, and to push the state of the art in optical vector estimation, the best model was applied to real data drawn from public data sets in which the ellipse is already annotated by an algorithm. The optical vectors on real data and the generator are made publicly available.
Citations: 25
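The regression task here can be illustrated with a toy baseline: fitting a linear map from one pupil-ellipse feature (the minor/major axis ratio) to one component of the optical vector, on synthetic data in the spirit of the paper's simulator-generated training set. The assumed linear relationship and all names are for illustration only; the models the paper evaluates (neural networks, SVMs, decision trees) are far more expressive.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b: one ellipse feature mapped
    to one optical-vector component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Synthetic data: as the eye turns away from the camera, the projected
# pupil ellipse narrows (axis ratio drops) and the horizontal component
# of the optical vector grows (assumed linear for this toy example).
ratios = [0.5, 0.6, 0.7, 0.8, 0.9, 1.0]       # minor/major axis ratio
gaze_x = [1.5 * (1.0 - r) for r in ratios]    # ground-truth component
a, b = fit_linear(ratios, gaze_x)
print(round(a, 3), round(b, 3))  # recovers slope -1.5, intercept 1.5
```

The window-based variants in the paper feed several consecutive ellipses at once, which lets a model exploit temporal context that a per-ellipse fit like this cannot.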
Synopticon: Sensor Fusion for Automated Gaze Analysis
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391986
Michael Hildebrandt, Jens-Patrick Langstrand, H. Nguyen
Abstract: This demonstration presents Synopticon, an open-source software system for automatic, real-time gaze object detection for mobile eye tracking. The system merges gaze data from eye tracking glasses with position data from a motion capture system and projects the resulting gaze vector onto a 3D model of the environment.
Citations: 0
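For a planar surface in the environment model (a screen, a panel), the projection step described above comes down to a ray-plane intersection: cast the world-space gaze ray, obtained by combining head pose and gaze direction, against the plane. A minimal sketch with assumed names; Synopticon itself works against full 3D models, not single planes.

```python
def gaze_hit_on_plane(origin, direction, plane_point, plane_normal):
    """Intersect a world-space gaze ray with a plane. Returns the hit
    point, or None if the ray is parallel to or points away from it."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None          # ray parallel to the plane
    t = sum((p - o) * n
            for p, o, n in zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None          # plane is behind the viewer
    return tuple(o + t * d for o, d in zip(origin, direction))

# Eyes at (0, 1.7, 2) looking slightly downward along -z; screen plane z = 0:
hit = gaze_hit_on_plane((0.0, 1.7, 2.0), (0.0, -0.1, -1.0),
                        (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(hit)  # hit point on the screen plane, roughly (0.0, 1.5, 0.0)
```

Mapping the hit point to a named object of interest is then a 2D containment test in the plane's local coordinates.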
Demo of a Visual Gaze Analysis System for Virtual Board Games
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391985
T. Munz, Noel Schaefer, Tanja Blascheck, K. Kurzhals, E. Zhang, D. Weiskopf
Abstract: We demonstrate a system for the visual analysis of eye movement data from competitive and collaborative virtual board games played by two persons. Our approach uses methods to temporally synchronize and spatially register gaze and mouse recordings from two, possibly different, eye tracking devices. Analysts can then examine the fused data with a combination of visualizations. We demonstrate our methods on the competitive game Go, which is especially complex for analyzing the strategies of individual players.
Citations: 1
Gaze - grabber distance in expert and novice forest machine operators: the effects of automatic boom control
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391414
Jani Koskinen, R. Bednarik
Abstract: Eye-hand coordination is a central skill in both everyday and expert visuo-motor tasks. In forest machine cockpits during harvesting, operators must coordinate eye and hand and navigate spatially to control the forwarder's boom smoothly and quickly to achieve high performance. Because it is largely unknown how this skill is acquired, we conducted the first eye-tracking study of its kind to uncover the strategies expert and novice operators use. In an authentic training situation, both groups used an industry-standard machine with and without intelligent boom control support, and we measured their gaze and boom control strategies.
Citations: 3
Analyzing Transferability of Happiness Detection via Gaze Tracking in Multimedia Applications
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391655
David Bethge, L. Chuang, T. Große-Puppendahl
Abstract: How are strong positive affective states related to eye-tracking features, and how can they be used to appropriately enhance well-being in multimedia consumption? In this paper, we propose a robust classification algorithm for predicting strong happy emotions from a large set of features acquired from wearable eye-tracking glasses. We evaluate the potential transferability across subjects and provide a model-agnostic, interpretable feature-importance metric. Our proposed algorithm achieves a true-positive rate of 70% while keeping a low false-positive rate of 10%, with features extracted from the pupil diameter being the most important.
Citations: 3
Implications of Eye Tracking Research to Cinematic Virtual Reality
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379157.3391658
Sylvia Rothe, L. Chuang
Abstract: While watching omnidirectional movies on a head-mounted display, the viewer has an immersive viewing experience. Turning the head and looking around is a natural input technique for choosing the visible part of the movie. To realize scene changes that depend on the viewing direction, and to implement non-linear story structures in cinematic virtual reality (CVR), selection methods are required for choosing story branches. The input device should not disturb the viewing experience, and the viewer should not be primarily aware of it. Eye- and head-based methods need no additional devices and seem especially suitable. We investigate several techniques using our own tool for analysing head- and eye-tracking data in CVR.
Citations: 0
Spontaneous Gaze Gesture Interaction in the Presence of Noises and Various Types of Eye Movements
ACM Symposium on Eye Tracking Research and Applications Pub Date: 2020-06-02 DOI: 10.1145/3379156.3391363
S. Wibirama, Suatmi Murnani, N. A. Setiawan
Abstract: Gaze gestures are a desirable technique for spontaneous and pervasive gaze interaction because of their insensitivity to spatial accuracy. Unfortunately, gaze gesture-based object selection using the correlation coefficient is prone to low object selection accuracy in the presence of noise. In addition, the effect of the various types of eye movements present during gaze gesture-based object selection has not been tackled properly. To overcome these problems, we propose a denoising method for gaze gesture-based object selection using a First Order IIR Filter, and an event detection method based on the Hidden Markov Model. The experimental results show that the proposed method yielded the best object selection accuracy. The results suggest that spontaneous gaze gesture-based object selection is feasible in the presence of noise and various types of eye movements.
Citations: 8
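Two of the building blocks named in this abstract can be sketched directly: a first-order IIR low-pass filter to denoise the gaze signal, and a Pearson correlation coefficient to match the filtered trace against a gesture template. The HMM-based event detection is omitted, and the function names, smoothing factor, and sample data are assumptions for illustration.

```python
def iir_lowpass(samples, alpha=0.3):
    """First-order IIR low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1]."""
    out = [samples[0]]
    for x in samples[1:]:
        out.append(alpha * x + (1.0 - alpha) * out[-1])
    return out

def pearson(a, b):
    """Correlation coefficient between a gaze trace and a gesture template."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

template = [0, 1, 2, 3, 4, 5, 6, 7]               # e.g. a rightward sweep (x-coords)
noisy = [0.2, 0.9, 2.3, 2.8, 4.1, 5.2, 5.8, 7.1]  # measured gaze x with noise
smooth = iir_lowpass(noisy)
print(pearson(smooth, template) > 0.9)  # strong match despite the noise: True
```

The filter introduces lag proportional to 1/alpha, which is one reason event detection (here, the paper's HMM) is needed to segment gestures from ordinary fixations and saccades before matching.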