Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications: Latest Publications

Sensitivity to natural 3D image transformations during eye movements
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204583
Maryam Keyvanara, R. Allison
Abstract: The saccadic suppression effect, in which visual sensitivity is significantly reduced during saccades, has been suggested as a mechanism for masking graphics updates in a 3D virtual environment. In this study, we investigate whether the degree of saccadic suppression depends on the type of image change, particularly between different natural 3D scene transformations. Users observed 3D scenes and made a horizontal saccade in response to the displacement of a target object in the scene. During this saccade, the entire scene translated or rotated. We studied six directions of transformation, corresponding to the canonical directions for the six degrees of freedom. Following each trial, the user made a forced-choice indication of the direction of the scene change. Results show that during horizontal saccades, the most recognizable changes were rotations about the roll axis.
Citations: 4
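
For illustration, here is a minimal numpy sketch of the six canonical transformations such a study manipulates: translation along, and rotation about, each of the x, y, and z axes, expressed as 4x4 homogeneous matrices that could be applied to a scene's model matrix during a saccade. Function names and magnitudes are ours, not the authors'.

```python
import numpy as np

def translation(axis, d):
    """Homogeneous translation of distance d along axis 0/1/2 (x/y/z)."""
    T = np.eye(4)
    T[axis, 3] = d
    return T

def rotation(axis, theta):
    """Homogeneous rotation of theta radians about axis 0/1/2
    (x = pitch, y = yaw, z = roll)."""
    c, s = np.cos(theta), np.sin(theta)
    R3 = {0: [[1, 0, 0], [0, c, -s], [0, s, c]],        # pitch
          1: [[c, 0, s], [0, 1, 0], [-s, 0, c]],        # yaw
          2: [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]  # roll
    R = np.eye(4)
    R[:3, :3] = R3
    return R

# Example: the change reported as most detectable, a small roll rotation.
scene_update = rotation(axis=2, theta=np.deg2rad(1.0))
```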

Mobile consumer shopping journey in fashion retail: eye tracking mobile apps and websites
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208335
Zofija Tupikovskaja-Omovie, D. Tyler
Abstract: Despite the rapid adoption of smartphones among fashion consumers, their dissatisfaction with retailers' mobile apps and websites is also increasing. This suggests that understanding how mobile consumers use smartphones for shopping is important for developing digital shopping platforms that fulfil consumers' expectations. Research to date has not focused on eye tracking consumer shopping behaviour on smartphones. For this research, we employed mobile eye tracking experiments to develop unique shopping journeys for each fashion consumer, accounting for differences and similarities in their behaviour. Based on scan path visualizations and shopping journeys, we developed a precise account of the areas most fashion consumers look at when browsing and inspecting product pages. From the findings, we identified mobile consumers' behaviour patterns and usability issues of the mobile channel, and established what features the mobile retail channel needs in order to satisfy fashion consumers' needs by offering pleasing user experiences.
Citations: 10

Revisiting data normalization for appearance-based gaze estimation
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204548
Xucong Zhang, Yusuke Sugano, A. Bulling
Abstract: Appearance-based gaze estimation is promising for unconstrained real-world settings, but the large variability in head pose and user-camera distance poses significant challenges for training generic gaze estimators. Data normalization was proposed to cancel out this geometric variability by mapping input images and gaze labels to a normalized space. Although used successfully in prior work, the role and importance of data normalization remain unclear. To fill this gap, we study data normalization for the first time using principled evaluations on both simulated and real data. We propose a modification to the current data normalization formulation by removing the scaling factor and show that our new formulation performs significantly better (between 9.5% and 32.7%) across the different evaluation settings. Using images synthesized from a 3D face model, we demonstrate the benefit of data normalization for the efficiency of model training. Experiments on real-world images confirm the advantages of data normalization in terms of gaze estimation performance.
Citations: 80
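
To make the normalization concrete, here is a minimal numpy sketch of the general scheme: a rotation R that points the camera at a facial reference point, a scaling S that cancels the user-camera distance, and the paper's proposed change of applying only R (rather than the full S·R) to the gaze direction label. The function names and axis conventions are our assumptions, not the authors' code.

```python
import numpy as np

def normalization_matrices(eye_center, head_R, d_norm=600.0):
    """Rotation R aligning the camera z-axis with the eye center, and
    scaling S fixing the camera-eye distance to d_norm (e.g., in mm).
    head_R is the 3x3 head rotation in camera coordinates."""
    d = np.linalg.norm(eye_center)
    z = eye_center / d                    # forward: toward the eye
    y = np.cross(z, head_R[:, 0])         # perpendicular to head x-axis
    y /= np.linalg.norm(y)
    x = np.cross(y, z)
    R = np.stack([x, y, z])               # rows form the new basis
    S = np.diag([1.0, 1.0, d_norm / d])   # distance normalization
    return R, S

def normalize_gaze(gaze_dir, R):
    """Proposed formulation: rotate the gaze label with R only (no
    scaling), so it stays a unit direction vector."""
    g = R @ np.asarray(gaze_dir, dtype=float)
    return g / np.linalg.norm(g)
```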

Error-aware gaze-based interfaces for robust mobile gaze interaction
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204536
Michael Barz, Florian Daiber, Daniel Sonntag, A. Bulling
Abstract: Gaze estimation error can severely hamper the usability and performance of mobile gaze-based interfaces, given that the error varies constantly across interaction positions. In this work, we explore error-aware gaze-based interfaces that estimate and adapt to gaze estimation error on the fly. We implement a sample error-aware user interface for gaze-based selection along with different error compensation methods: a naïve approach that increases component size in direct proportion to the absolute error, a recent model by Feit et al. that is based on the two-dimensional error distribution, and a novel predictive model that shifts gaze by a directional error estimate. We evaluate these models in a 12-participant user study and show that our predictive model significantly outperforms the others in terms of selection rate, particularly for small gaze targets. These results underline both the feasibility and the potential of next-generation error-aware gaze-based user interfaces.
Citations: 27
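
To make the compensation strategies concrete, here is a schematic Python sketch of the two simpler ideas: inflating a target in proportion to the absolute error, and shifting a gaze sample by a directional error estimate before hit-testing. The error model is a placeholder and the names are ours; this is not the paper's implementation.

```python
import numpy as np

def inflate_target(base_radius, abs_error, k=1.0):
    """Naive compensation: enlarge the selectable area in direct
    proportion to the absolute gaze estimation error."""
    return base_radius + k * abs_error

def corrected_hit_test(gaze_xy, target_xy, target_radius, error_estimate_fn):
    """Predictive compensation: shift the gaze sample by a directional
    error estimate for that screen position, then hit-test.
    error_estimate_fn stands in for a learned error model."""
    shifted = np.asarray(gaze_xy) - np.asarray(error_estimate_fn(gaze_xy))
    return np.linalg.norm(shifted - np.asarray(target_xy)) <= target_radius
```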

Fixation detection for head-mounted eye tracking based on visual similarity of gaze targets
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204538
Julian Steil, Michael Xuelin Huang, A. Bulling
Abstract: Fixations are widely analysed in human vision, gaze-based interaction, and experimental psychology research. However, robust fixation detection in mobile settings is profoundly challenging given the prevalence of user and gaze target motion. These movements feign a shift in gaze estimates in the frame of reference defined by the eye tracker's scene camera. To address this challenge, we present a novel fixation detection method for head-mounted eye trackers. Our method exploits the fact that, independent of user or gaze target motion, target appearance remains roughly constant during a fixation. It extracts image information from small regions around the current gaze position and analyses the appearance similarity of these gaze patches across video frames to detect fixations. We evaluate our method using fine-grained fixation annotations on a five-participant indoor dataset (MPIIEgoFixation) with more than 2,300 fixations in total. Our method outperforms commonly used velocity- and dispersion-based algorithms, which highlights its significant potential for analysing scene image information in eye movement detection.
Citations: 31
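
A toy sketch of the core idea: crop a small patch around each gaze sample in the scene video and mark spans where consecutive patches remain visually similar as fixation candidates. The correlation measure and threshold are our assumptions; the paper's actual features and decision rule may differ.

```python
import numpy as np

def gaze_patch(frame, gaze_xy, half=20):
    """Crop a (2*half)x(2*half) patch around the gaze point from a
    grayscale frame; assumes gaze lies away from the image border."""
    x, y = int(gaze_xy[0]), int(gaze_xy[1])
    return frame[y - half:y + half, x - half:x + half].astype(float)

def patch_similarity(p1, p2):
    """Normalized cross-correlation of two equally sized patches."""
    a = (p1 - p1.mean()) / (p1.std() + 1e-8)
    b = (p2 - p2.mean()) / (p2.std() + 1e-8)
    return float((a * b).mean())

def detect_fixation_steps(frames, gaze, sim_threshold=0.7):
    """Flag each inter-frame step as fixation-like when the appearance
    around the gaze position stays similar across consecutive frames."""
    patches = [gaze_patch(f, g) for f, g in zip(frames, gaze)]
    return [patch_similarity(p, q) >= sim_threshold
            for p, q in zip(patches, patches[1:])]
```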

Hidden pursuits: evaluating gaze-selection via pursuits when the stimuli's trajectory is partially hidden
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204569
Thomas Mattusch, Mahsa Mirzamohammad, M. Khamis, A. Bulling, Florian Alt
Abstract: The idea behind gaze interaction using Pursuits is to leverage the smooth pursuit eye movements humans perform when following moving targets. However, humans can also anticipate where a moving target will reappear if it temporarily hides from their view. In this work, we investigate how well users can select targets using Pursuits when the target's trajectory is partially invisible (HiddenPursuits): for example, can users select a moving target that temporarily hides behind another object? Although HiddenPursuits has not been studied in the context of interaction before, understanding how well users can perform HiddenPursuits presents numerous opportunities, particularly for small interfaces where a target's trajectory can cover an area outside the screen. We found that users can still select targets quickly via Pursuits even when the trajectory is up to 50% hidden, albeit with longer selection times as the hidden portion grows. We discuss how gaze-based interfaces can leverage HiddenPursuits for an improved user experience.
Citations: 4
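
For context, the standard (fully visible) Pursuits mechanism selects the target whose recent trajectory correlates best with the gaze trajectory. A minimal sketch of that baseline follows, with an illustrative threshold; the study's actual parameters may differ, and the sketch assumes every target moves on both axes within the window.

```python
import numpy as np

def pursuits_select(gaze_window, target_windows, threshold=0.8):
    """Return the index of the moving target whose trajectory best
    correlates with gaze, or None if no target passes the threshold.
    All windows are arrays of shape (n_samples, 2)."""
    best_idx, best_score = None, threshold
    gx, gy = np.asarray(gaze_window, dtype=float).T
    for i, tw in enumerate(target_windows):
        tx, ty = np.asarray(tw, dtype=float).T
        # Pearson correlation per axis, combined conservatively.
        score = min(np.corrcoef(gx, tx)[0, 1], np.corrcoef(gy, ty)[0, 1])
        if score >= best_score:
            best_idx, best_score = i, score
    return best_idx
```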

Deep learning vs. manual annotation of eye movements
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208346
Mikhail Startsev, I. Agtzidis, M. Dorr
Abstract: Deep learning models have already revolutionized many research fields. However, raw eye movement data is still typically processed into discrete events via threshold-based algorithms or manual labelling. In this work, we describe a compact 1D CNN model, which we combine with a BLSTM to achieve end-to-end sequence-to-sequence learning. We discuss the acquisition process for the ground truth we use, as well as the performance of our approach in comparison to various literature models and manual raters. Our deep method demonstrates superior performance, which brings us closer to human-level labelling quality.
Citations: 0
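
A rough Keras sketch of the architecture family the abstract describes: a small 1D CNN over per-sample gaze features followed by a bidirectional LSTM that labels every time step (e.g., fixation, saccade, or pursuit). Layer sizes, input features, and the three-class output are illustrative assumptions, not the authors' published configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_seq2seq_labeller(seq_len=257, n_features=2, n_classes=3):
    """1D CNN feature extractor + BLSTM, one label per time step."""
    inputs = layers.Input(shape=(seq_len, n_features))  # e.g., x/y speed
    x = layers.Conv1D(32, 11, padding="same", activation="relu")(inputs)
    x = layers.Conv1D(32, 11, padding="same", activation="relu")(x)
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model
```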

Robustness of metrics used for scanpath comparison
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204580
F. Děchtěrenko, J. Lukavský
Abstract: In every quantitative eye tracking study, researchers need to compare eye movements between subjects or conditions. For both static and dynamic tasks, a variety of metrics could serve this purpose, and it is important to explore their robustness with respect to artificial noise. For dynamic tasks, where eye movement data is represented as scanpaths, there are currently no studies of the robustness of these metrics. In this study, we explored the properties of five metrics used for scanpath comparison (Levenshtein distance, correlation distance, Fréchet distance, and mean and median distance). We systematically added noise by applying three transformations to the scanpaths: translation, rotation, and scaling. For each metric, we computed a baseline similarity for two random scanpaths and explored the metric's sensitivity. Our results allow other researchers to convert results between studies.
Citations: 1
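
A small numpy sketch of the perturbation scheme outlined above: apply a translation, rotation, or scaling to one scanpath and measure how a simple metric (here, mean point-wise Euclidean distance) responds. Noise magnitudes are illustrative.

```python
import numpy as np

def transform(scanpath, translation=(0.0, 0.0), angle=0.0, scale=1.0):
    """Translate, rotate (radians, about the centroid), and scale a
    scanpath given as an (n, 2) array of fixation coordinates."""
    p = np.asarray(scanpath, dtype=float)
    c = p.mean(axis=0)
    rot = np.array([[np.cos(angle), -np.sin(angle)],
                    [np.sin(angle),  np.cos(angle)]])
    return (p - c) @ rot.T * scale + c + np.asarray(translation)

def mean_distance(a, b):
    """Mean point-wise Euclidean distance of two equally long scanpaths."""
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b), axis=1).mean())

# Example: sensitivity of the mean-distance metric to a 5-degree rotation.
path = np.random.rand(20, 2) * 100
print(mean_distance(path, transform(path, angle=np.deg2rad(5))))
```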

The eye of the typer: a benchmark and analysis of gaze behavior during typing
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204552
Alexandra Papoutsaki, Aaron Gokaslan, J. Tompkin, Yuze He, Jeff Huang
Abstract: We examine the relationship between eye gaze and typing, focusing on the differences between touch and non-touch typists. To enable typing-based research, we created a 51-participant benchmark dataset for user input across multiple tasks, including user input data, screen recordings, webcam video of the participant's face, and eye tracking positions. Patterns of eye movements differ between the two types of typists, representing glances at the keyboard, which can be used to identify touch-typed strokes with 92% accuracy. We then relate eye gaze to cursor activity, aligning both pointing and typing with eye gaze. One demonstrative application of this work is in extending WebGazer, a real-time web-browser-based webcam eye tracker. We show that incorporating typing behavior as a secondary signal improves eye tracking accuracy by 16% for touch typists and 8% for non-touch typists.
Citations: 21
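
A conceptual sketch, in the spirit of the WebGazer extension mentioned above, of using typing as a secondary calibration signal: on each keystroke, treat the on-screen text caret as a weak gaze label, trusting it more for touch typists, who rarely glance down at the keyboard. The tracker interface here is hypothetical, not WebGazer's real API.

```python
def on_keystroke(tracker, caret_xy, is_touch_typist, weight=0.5):
    """Feed the caret position to the tracker as a weak calibration
    sample at keystroke time. tracker.add_calibration_sample is a
    hypothetical interface, not WebGazer's actual API."""
    if not is_touch_typist:
        weight *= 0.5  # non-touch typists often look down at the keys
    tracker.add_calibration_sample(point=caret_xy, weight=weight)
```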

Automatic detection and inhibition of neutral and emotional stimuli in post-traumatic stress disorder: an eye-tracking study: eye-tracking data of an original antisaccade task
Pub Date: 2018-06-14. DOI: 10.1145/3204493.3207419
Wivine Blekić, M. Rossignol
Abstract: This research project addresses the understanding of attentional biases in post-traumatic stress disorder (PTSD). This psychiatric condition is mainly characterized by symptoms of intrusion (flashbacks), avoidance, alterations in arousal and reactivity (hypervigilance), and negative mood and cognitions persisting one month after exposure to a traumatic event [American Psychiatric Association 2013]. Clinical observations as well as empirical research have highlighted hypervigilance as central to PTSD symptomatology, considering that other clinical features may be maintained by it [Ehlers and Clark 2000]. Attentional control theory has described hypervigilance in anxiety disorders as the co-occurrence of two cognitive processes: an enhanced detection of threatening information followed by difficulties inhibiting its processing [Eysenck et al. 2007]. Nevertheless, attentional control theory has never been applied to PTSD. This project aims to provide cognitive evidence of hypervigilance symptoms in PTSD using eye tracking during the performance of reliable Miyake tasks [Eysenck and Derakshan 2011]. Our first aim is therefore to model the co-occurring processes of hypervigilance using eye-tracking technology, since behavioral measures (such as reaction time) do not allow a clear representation of cognitive processes that occur subconsciously within a few milliseconds [Felmingham 2016]. Secondly, we aim to analyze the differential impact of trauma-related versus negative stimuli on PTSD patients by examining scan paths following the presentation of both types of stimuli. This research project is divided into four studies; the first is described in this doctoral symposium contribution.
Citations: 0