2022 Symposium on Eye Tracking Research and Applications — Latest Publications

Visualizing Instructor’s Gaze Information for Online Video-based Learning: Preliminary Study
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529238
Daun Kim, Jae-Yeop Jeong, Sumin Hong, Namsub Kim, Jin-Woo Jeong
Abstract: Video-based online educational content has become increasingly popular. However, the limited communication and interaction between learners and instructors has led to various problems with learning performance. Gaze-sharing techniques have received much attention as a means to address this problem, but there is still considerable room for improvement. In this work-in-progress paper, we introduce possible improvements to gaze visualization strategies and report the preliminary results of the first step toward our final goal. Through a user study with 30 university students, we demonstrated the feasibility of the prototype system and identified future directions for our research.
Citations: 0
Mind Wandering Trait-level Tendencies During Lecture Viewing: A Pilot Study
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529241
Francesca Zermiani, A. Bulling, M. Wirzberger
Abstract: Mind wandering (MW) is defined as a shift of attention to task-unrelated internal thoughts; it is pervasive and disruptive to learning performance. Current state-of-the-art gaze-based attention-aware intelligent systems can detect MW from eye movements and deliver interventions to mitigate its negative effects. However, the beneficial functions of MW and its trait-level tendency, defined as the content of the MW experience, are still largely neglected by these systems. In this pilot study, we address the question of whether different MW trait-level tendencies can be detected through the frequency and duration of off-screen fixations and the blink rate during a lecture-viewing task. We focus on prospective planning and creative problem-solving as two of the main MW trait-level tendencies. Although the differences were not statistically significant, the descriptive values show a higher frequency and duration of off-screen fixations, but a lower blink rate, in the creative problem-solving MW condition. Interestingly, we do find a highly significant correlation between MW level and engagement scores in the prospective-planning MW group. Potential explanations for the observed results are discussed. Overall, these findings represent a preliminary step toward the development of more accurate and adaptive learning technologies and call for further studies on MW trait-level tendency detection.
Citations: 0
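The off-screen fixation measures used in the study above can be computed directly from a fixation log. The following is a minimal illustrative sketch, not the authors' code; the sample format (dicts with `x`, `y`, and `dur` keys) is an assumption.

```python
def off_screen_stats(fixations, screen_w, screen_h):
    """Frequency (fraction of fixations) and mean duration (ms) of off-screen fixations.

    `fixations` is a list of dicts with keys "x", "y" (pixels) and "dur" (ms);
    these key names are illustrative assumptions, not the study's actual format.
    """
    off = [f for f in fixations
           if not (0 <= f["x"] <= screen_w and 0 <= f["y"] <= screen_h)]
    freq = len(off) / len(fixations) if fixations else 0.0
    mean_dur = sum(f["dur"] for f in off) / len(off) if off else 0.0
    return freq, mean_dur
```

In practice, such measures would be aggregated per participant and compared across the two MW trait-level conditions.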
Faster, Better Blink Detection through Curriculum Learning by Augmentation
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529617
A. Al-Hindawi, Marcela P. Vizcaychipi, Y. Demiris
Abstract: Blinking is a useful biological signal that can gate gaze-regression models to avoid the use of incorrect data in downstream tasks. Existing datasets are imbalanced both in class frequency and in intra-class difficulty, which we demonstrate is a barrier to curriculum learning. We therefore propose a novel curriculum augmentation scheme that implicitly addresses frequency and difficulty imbalances, which we term Curriculum Learning by Augmentation (CLbA). Using CLbA, we achieve a state-of-the-art mean Average Precision (mAP) of 0.971 with ResNet-18, up from the previous state of the art of 0.757 with DenseNet-121, while outcompeting Curriculum Learning by Bootstrapping (CLbB) by a significant margin with improved calibration. This new training scheme allows the use of smaller, more performant Convolutional Neural Network (CNN) backbones, fulfilling the Nyquist criterion to achieve a sampling frequency of 102.3 Hz. This paves the way for inferring blinking in real-time applications.
Citations: 2
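The mAP figures quoted above are averages of per-class average precision (AP). As a reminder of what AP measures, here is a minimal sketch of the standard rank-based definition for binary labels; it is not the authors' evaluation code.

```python
def average_precision(scores, labels):
    """AP for binary labels: mean of precision at the rank of each positive,
    with items ranked by descending score."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp, precisions = 0, []
    for rank, i in enumerate(order, 1):
        if labels[i]:
            tp += 1
            precisions.append(tp / rank)  # precision at this true-positive's rank
    positives = sum(labels)
    return sum(precisions) / positives if positives else 0.0
```

mAP is then the mean of this quantity over the classes being detected (here, blink vs. non-blink events).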
Inferring Native and Non-Native Human Reading Comprehension and Subjective Text Difficulty from Scanpaths in Reading
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529639
David Reich, Paul Prasse, Chiara Tschirner, Patrick Haller, Frank Goldhammer, L. Jäger
Abstract: Eye movements in reading are known to reflect the cognitive processes involved in reading comprehension at all linguistic levels, from the sub-lexical to the discourse level. This means that it should be possible to infer reading comprehension and other properties of the text and/or the reader from eye movements. We therefore develop the first neural sequence architecture for this type of task, which models scanpaths in reading and incorporates lexical, semantic, and other linguistic features of the stimulus text. Our proposed model outperforms state-of-the-art models on various tasks, including inferring reading comprehension or text difficulty and assessing whether the reader is a native speaker of the text’s language. We further conduct an ablation study to investigate the impact of each component of the proposed neural network on its performance.
Citations: 9
Comparison of Webcam and Remote Eye Tracking
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529615
K. Wisiecka, Krzysztof Krejtz, I. Krejtz, Damian Sromek, Adam Cellary, Beata Lewandowska, A. Duchowski
Abstract: We compare the measurement error and validity of webcam-based eye tracking to those of a remote eye tracker, as well as a software integration of both. We ran a study with n = 83 participants, consisting of a point-detection task and an emotional visual-search task under three between-subjects experimental conditions (webcam-based, remote, and integrated). We analyzed location-based eye-tracking metrics (e.g., fixations) and process-based metrics (ambient-focal attention dynamics). Despite the higher measurement error of webcam eye tracking, our results in all three experimental conditions were in line with theoretical expectations. For example, time to first fixation toward happy faces was significantly shorter than toward sad faces (the happiness-superiority effect). As expected, we also observed the switch from ambient to focal attention depending on the complexity of the visual stimuli. We conclude that webcam-based eye tracking is a viable, low-cost alternative to remote eye tracking.
Citations: 18
A study on the generalizability of Oculomotor Plant Mathematical Model
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3532523
Dmytro Katrychuk, Oleg V. Komogortsev
Abstract: The Oculomotor Plant Mathematical Model (OPMM) is a dynamic system that describes a human eye in motion. In this study, we focus on an anatomically inspired homeomorphic model in which every component is a mathematical representation of a biological phenomenon of the real oculomotor plant. This approach estimates the internal state of the oculomotor plant from recorded eye movements. In the past, such models were shown to be useful in biometrics and in gaze-contingent rendering via eye-movement prediction. Previous studies made the implicit underlying assumption that a set of parameters estimated for a given subject remains consistent over time and generalizes to unseen data. We note a major drawback of the prior work: it operated under this assumption without explicit validation. This work creates a quantifiable baseline for the specific OPMM in which the generalizability of the model parameters is the foundational property of their estimation.
Citations: 0
Linked and Coordinated Visual Analysis of Eye Movement Data
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3531163
Michael Burch, Günter Wallner, Veerle Fürst, Teodor-Cristian Lungu, Daan Boelhouwers, Dhiksha Rajasekaran, Richard Farla, Sander van Heesch
Abstract: Eye movement data can be used in marketing, advertising, and other design-related industries to gain insights into customer preferences. However, interpreting such data can be challenging due to its spatio-temporal complexity. In this paper, we describe a web-based tool that provides various visualizations for interpreting eye movement data recorded on static stimuli. The tool offers several techniques to visualize and analyze eye movement data; these visualizations are interactive and linked in a coordinated way to help gain deeper insights. The paper illustrates the tool’s features and functionality using, as a use case, data recorded from transport-map readers in a previously conducted experiment. Furthermore, the paper discusses the tool’s limitations and possible future developments.
Citations: 0
Characterizing the expertise of Aircraft Maintenance Technicians using eye-tracking.
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3532199
F. Paris, Remy Casanova, M. Bergeonneau, D. Mestre
Abstract: Aircraft maintenance technicians (AMTs) play an essential role in the life-long safety of helicopters. Maintenance activity involves two major types of operations: information intake/processing and motor actions. Modeling the expertise of the AMT is the main objective of this doctoral project. Given the constraints of real-world research, mobile eye tracking appears to be an essential tool for measuring information intake, notably concerning the use of maintenance documentation during maintenance-task preparation and execution. This extended abstract presents the main research objectives, our approach and methodology, and some preliminary results.
Citations: 0
Distance between gaze and laser pointer predicts performance in video-based e-learning independent of the presence of an on-screen instructor
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529620
Marian Sauter, Tobias Wagner, A. Huckauf
Abstract: In online lectures, showing an on-screen instructor gained popularity amid the Covid-19 pandemic. However, the evidence in its favor is mixed: on-screen instructors draw attention and may distract from the content. In contrast, signaling (e.g., with a digital pointer) provides known benefits for learners, but the effects of signaling have so far only been studied in the absence of an on-screen instructor. In the present exploratory study, we investigated the effects of an on-screen instructor on the division of learners’ attention, specifically on following a digital pointer signal with their gaze. The presence of an instructor led to an increased number of fixations in the presenter area. This affected neither learning outcomes nor gaze patterns following the pointer. The average distance between the learner’s gaze and the pointer position predicts the student’s quiz performance, independent of the presence of an on-screen instructor. This can also help in creating automated immediate-feedback systems for educational videos.
Citations: 3
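The predictor in the study above, the average distance between gaze and pointer, reduces to a mean Euclidean distance over time-aligned samples. A minimal sketch follows; the sample format is an assumption, and the authors' preprocessing (e.g., resampling or conversion to visual angle) is not shown.

```python
import math

def mean_gaze_pointer_distance(gaze, pointer):
    """Mean Euclidean distance (pixels) between time-aligned gaze and pointer samples.

    `gaze` and `pointer` are equal-length sequences of (x, y) tuples sampled at
    matching timestamps -- an illustrative assumption about the data format.
    """
    if len(gaze) != len(pointer) or not gaze:
        raise ValueError("expected two non-empty, equally long sample sequences")
    return sum(math.dist(g, p) for g, p in zip(gaze, pointer)) / len(gaze)
```

A lower value indicates that the learner's gaze tracked the pointer more closely, which the study links to better quiz performance.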
Scanpath Comparison using ScanGraph for Education and Learning Purposes: Summary of previous educational studies performed with the use of ScanGraph
2022 Symposium on Eye Tracking Research and Applications Pub Date : 2022-06-08 DOI: 10.1145/3517031.3529243
S. Popelka, Marketa Beitlova
Abstract: This short paper summarizes previous studies in education in which ScanGraph, a tool developed for scanpath comparison, has been used. The paper aims to introduce this freely available online tool to eye movement researchers focusing on eye tracking in education. ScanGraph calculates similarity using the Levenshtein and Damerau-Levenshtein algorithms and the Needleman-Wunsch algorithm (similar to ScanMatch). The results are visualized in a simple graph showing similarities among individual participants. The tool allows exporting a similarity matrix, which can be used for more detailed analysis. Moreover, it can visualize similarity data calculated with the MultiMatch method. The article describes the tool’s functionality and introduces it with case studies from geographic education and physics.
Citations: 3
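The string-edit approach behind ScanGraph can be illustrated with plain Levenshtein distance over AOI sequences (each fixation coded as a character naming its area of interest). The normalization below is a common convention and an assumption here; ScanGraph's exact formula may differ.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between two AOI-sequence strings (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # delete ca
                           cur[j - 1] + 1,               # insert cb
                           prev[j - 1] + (ca != cb)))    # substitute (or match)
        prev = cur
    return prev[-1]

def scanpath_similarity(a: str, b: str) -> float:
    """1.0 for identical scanpaths, 0.0 for maximally different ones."""
    longest = max(len(a), len(b))
    return 1.0 if longest == 0 else 1.0 - levenshtein(a, b) / longest
```

Pairwise similarities computed this way can be assembled into the similarity matrix that ScanGraph exports and visualizes as a graph of participants.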