Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications: Latest Publications

A gaze-based experimenter platform for designing and evaluating adaptive interventions in information visualizations
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications · Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3322502
Sébastien Lallé, C. Conati, Dereck Toker
Abstract: We present an experimenter platform for designing and evaluating user-adaptive support in information visualizations. Specifically, this platform leverages eye-tracking data in real time to deliver adaptive support in visualizations based on the user's attentional patterns and individual needs. We describe the main functionalities of this platform and show an application that supports the processing of textual documents with embedded bar charts by dynamically highlighting parts of the charts to guide the user's attention to the relevant information.
Citations: 4
Factors influencing dwell time during source code reading: a large-scale replication experiment
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3319833
Cole S. Peterson, Nahla J. Abid, Corey A. Bryant, Jonathan I. Maletic, Bonita Sharif
Abstract: The paper partially replicates and extends a previous study by Busjahn et al. [4] on the factors influencing dwell time during source code reading, where source code element type and frequency of gaze visits are studied as factors. Unlike the previous study, this study focuses on analyzing eye movement data from large open-source Java projects. Five experts and thirteen novices participated in the study, where the main task was to summarize methods. The results examine the semantic line-level information that developers view during summarization. We find no correlation between line length and the total time spent looking at a line, even though prior work reported such a correlation between a token's length and the total fixation time on that token. The first fixations inside a method are more likely than later fixations to land on the method's signature, a variable declaration, or an assignment. In addition, smaller methods tend to have a shorter overall fixation duration for the entire method, but a significantly longer duration per line.

The analysis provides insights into how source code's unique characteristics can help in building more robust methods for analyzing eye movements in source code and, more broadly, in building theories to support program comprehension on realistic tasks.
Citations: 10
Eye-tracking based fatigue and cognitive assessment: doctoral symposium, extended abstract
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3322867
Tanya Bafna, J. P. Hansen
Abstract: Fatigue detection, monitoring, and management are important and need to fit into the busy lifestyles many people lead these days. Fatigue can affect both the physical and the emotional health of individuals, and detecting it is the first step towards managing it. With camera-based eye-tracking software now being included in laptops and smartphones, the technology has the potential to become quite ubiquitous. This extended abstract describes my PhD project on fatigue detection using eye-tracking measures during gaze typing. The steps taken and experiments conducted up to now are presented, along with an outline of future plans. The principal use case is to provide fatigue detection for people with neurological disorders who use eye tracking for alternative communication.
Citations: 2
Just gaze and wave: exploring the use of gaze and gestures for shoulder-surfing resilient authentication
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3319837
Yasmeen Abdrabou, M. Khamis, Rana Mohamed Eisa, Sherif Ismail, Amr Elmougy
Abstract: Eye gaze and mid-air gestures are promising for resisting various types of side-channel attacks during authentication. However, to date, a comparison of the different authentication modalities is missing. We investigate multiple authentication mechanisms that leverage gestures, eye gaze, and a multimodal combination of them, and study their resilience to shoulder surfing. To this end, we report on our implementation of three schemes and on results from usability and security evaluations in which we also experimented with fixed and randomized layouts. We found that the gaze-based approach outperforms the other schemes in terms of input time, error rate, perceived workload, and resistance to observation attacks, and that randomizing the layout does not improve observation resistance enough to warrant the reduced usability. Our work further underlines the significance of replicating previous eye-tracking studies using today's sensors, as we show significant improvement over similar previously introduced gaze-based authentication systems.
Citations: 33
Encodji
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3323074
Wolfgang Fuhl, Efe Bozkir, Benedikt W. Hosp, Nora Castner, David Geisler, Thiago Santini, Enkelejda Kasneci
Abstract: To this day, a variety of information has been obtained from human eye movements, which hold an immense potential for understanding and classifying cognitive processes and states, e.g., through scanpath classification. In this work, we explore the task of scanpath classification through a combination of unsupervised feature learning and convolutional neural networks. As an amusement factor, we use an Emoji space representation as the feature space. This representation is achieved by training generative adversarial networks (GANs) for unpaired scanpath-to-Emoji translation with a cyclic loss. The resulting Emojis are then used to train a convolutional neural network for stimulus prediction, showing an accuracy improvement of more than five percentage points compared to the same network trained solely on the scanpath data. As a side effect, we also obtain novel, unique Emojis representing each unique scanpath. Our goal is to demonstrate the applicability and potential of unsupervised feature learning for scanpath classification in a humorous and entertaining way.
Citations: 27
Clustered eye movement similarity matrices
Pub Date: 2019-06-25 · DOI: 10.1145/3317958.3319811
Ayush Kumar, Neil Timmermans, Michael Burch, K. Mueller
Abstract: Eye movements recorded for many study participants are difficult to interpret, in particular when the task is to identify similar scanning strategies over space, time, and participants. In this paper we describe an approach in which we first compare scanpaths, not only based on Jaccard (JD) and bounding-box (BB) similarities, but also on more complex measures such as longest common subsequence (LCS), Fréchet distance (FD), dynamic time warping (DTW), and edit distance (ED). These algorithms generate a weighted comparison matrix in which each entry encodes the pairwise scanpath-comparison strength between two participants. To better identify groups of participants with similar eye movement behavior, we reorder this matrix by hierarchical clustering, optimal-leaf ordering, dimensionality reduction, or a spectral approach. The matrix visualization is linked to the original stimulus, overplotted with visual attention maps and gaze plots, on which typical interactions such as temporal, spatial, or participant-based filtering can be applied.
Citations: 15
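The pipeline this abstract describes — pairwise scanpath comparison, a distance matrix, then reordering so similar participants sit next to each other — can be sketched as follows. This is a simplified illustration assuming scanpaths encoded as AOI-label strings and using edit distance (one of the six measures listed) with SciPy's hierarchical clustering; it is not the authors' implementation, and the scanpaths are invented:

```python
# Simplified sketch of the described pipeline (not the authors' code):
# pairwise scanpath edit distance -> distance matrix -> reorder rows and
# columns via hierarchical clustering with optimal leaf ordering.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list, optimal_leaf_ordering
from scipy.spatial.distance import squareform

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two AOI-label scanpaths."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

# Hypothetical scanpaths: each letter is a fixated AOI, in temporal order.
scanpaths = ["ABCAB", "ABCAA", "CCBAA", "ABCAB", "CBBAA"]
n = len(scanpaths)
D = np.array([[edit_distance(scanpaths[i], scanpaths[j]) for j in range(n)]
              for i in range(n)], dtype=float)

# Cluster participants and reorder the matrix so similar scanning
# strategies form visible blocks along the diagonal.
Z = optimal_leaf_ordering(linkage(squareform(D), method="average"), squareform(D))
order = leaves_list(Z)
print("participant order:", order.tolist())
print(D[np.ix_(order, order)])
```

In the reordered matrix, participants 0 and 3 (identical scanpaths, distance 0) end up adjacent, which is exactly the block structure the visualization exploits.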
GeoGCD
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3321488
K. Bektaş, A. Çöltekin, J. Krüger, A. Duchowski, S. Fabrikant
Citations: 0
Pupil diameter as a measure of emotion and sickness in VR
Pub Date: 2019-06-25 · DOI: 10.1145/3314111.3322868
Brendan David-John
Abstract: Eye tracking is rapidly becoming popular in consumer technology, including virtual and augmented reality. Eye trackers commonly provide an estimate of gaze location and pupil diameter. Pupil diameter is useful for interactive systems, as it provides a means to estimate cognitive load, stress, and emotional state. However, several roadblocks limit the use of pupil diameter. In VR head-mounted displays there is a lack of models that account for stereoscopic viewing and the increased brightness of near-eye displays. Existing work has shown correlations between pupil diameter and emotion, but it has not been extended to VR environments. The scope of this work is to bridge the gap between existing research on emotion and pupil diameter and VR, while also attempting to use pupillary data to tackle the problem of simulator sickness in VR.
Citations: 6
Visually comparing eye movements over space and time
Pub Date: 2019-06-25 · DOI: 10.1145/3317958.3319810
Ayush Kumar, Michael Burch, K. Mueller
Abstract: Analyzing and visualizing eye movement data can provide useful insights into the connections and links between points and areas of interest (POIs and AOIs). These typically time-varying relations can give hints about the visual scanning strategies applied by individual or many eye-tracked people. However, the challenging issue with this kind of data is its spatio-temporal nature, which requires a good visual encoding in order to, first, achieve a scalable overview-based diagram and, second, derive static or dynamic patterns that might correspond to certain comparable visual scanning strategies. To reliably identify the dynamic strategies, we describe a visualization technique that generates a more linear representation of the spatio-temporal scanpaths. This is achieved by applying different visual encodings of the spatial dimensions, which otherwise limit an eye movement data visualization by causing visual clutter, overdraw, and occlusion, while the temporal dimension is depicted as a linear time axis.

The presented interactive visualization concept is composed of three linked views depicting spatial, metrics-related, and distance-based aspects over time.
Citations: 6
SaccadeMachine
Pub Date: 2019-06-25 · DOI: 10.1145/3317956.3318148
D. Mardanbegi, T. Wilcockson, P. Sawyer, Hans Gellersen, T. Crawford
Abstract: Various types of saccadic paradigms, in particular prosaccade and antisaccade tests, are widely used in pathophysiology and psychology. Despite this wide use, there has been no standard tool for processing and analyzing the eye-tracking data obtained from saccade tests. We describe open-source software for extracting and analyzing the eye movement data of different types of saccade tests, which can be used to extract and compare participants' performance and various task-related measures across participants. We further demonstrate the utility of the software by using it to analyze data from an antisaccade experiment and a recent-distractor experiment.
Citations: 5
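For context on the kind of task-related measure such a tool extracts, here is a generic sketch of one common prosaccade metric, saccadic reaction time, estimated with a simple velocity threshold. This is illustrative only — it is not SaccadeMachine's API or algorithm, and the gaze trace is synthetic:

```python
# Generic illustration (assumed logic, not SaccadeMachine's implementation):
# saccade latency = time from target onset until gaze velocity first
# exceeds a threshold, a standard measure in pro/antisaccade tests.
def saccade_latency_ms(timestamps_ms, gaze_x_deg, onset_ms, vel_thresh_deg_s=30.0):
    """Return latency (ms) of the first saccade after `onset_ms`, or None."""
    for i in range(1, len(timestamps_ms)):
        if timestamps_ms[i] <= onset_ms:
            continue
        dt = (timestamps_ms[i] - timestamps_ms[i - 1]) / 1000.0  # seconds
        vel = abs(gaze_x_deg[i] - gaze_x_deg[i - 1]) / dt        # deg/s
        if vel > vel_thresh_deg_s:
            return timestamps_ms[i] - onset_ms
    return None

# Synthetic 500 Hz trace: stable fixation, then a rapid 10-degree
# horizontal shift beginning at 180 ms after target onset.
ts = [i * 2 for i in range(150)]                                 # 0..298 ms
x = [0.0] * 90 + [(i + 1) * 1.0 for i in range(10)] + [10.0] * 50
print(saccade_latency_ms(ts, x, onset_ms=0))                     # 180
```

Real tools additionally handle blinks, noise filtering, and direction errors (e.g., looking toward the target in an antisaccade trial), which this sketch omits.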