2022 Symposium on Eye Tracking Research and Applications: Latest Publications

Using Eye Tracking Data for Enhancing Adaptive Learning Systems
Pub Date: 2022-06-08 | DOI: 10.1145/3517031.3532195
Kathrin Kennel
Abstract: Adaptive learning systems analyse a learner's input and respond on that basis, for example by providing individual feedback or selecting appropriate follow-up tasks. To provide good feedback, such a system must have a high diagnostic capability. Collecting gaze data alongside the traditional data obtained through mouse and keyboard input seems a promising way to achieve this. We use the example of graphical differentiation to investigate whether and how eye tracking data can be integrated into such a system. To this end, we analyse students' eye tracking data and gather empirical understanding of which measures are suitable as decision support for adaptation.
Citations: 0
Poster: A Preliminary Investigation on Eye Gaze-based Concentration Recognition during Silent Reading of Text
Pub Date: 2022-06-08 | DOI: 10.1145/3517031.3531632
Saki Tanaka, Airi Tsuji, K. Fujinami
Abstract: We propose machine learning models that recognize states of non-concentration from eye-gaze data, with the aim of improving productivity. Experimental results show that a Random Forest classifier with a 12 s window can distinguish the states with an F1-score above 0.9.
Citations: 0
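The abstract above describes classifying fixed windows of gaze data; the sketch below illustrates only the 12 s windowing and feature-extraction step (the sampling rate, feature set, and function name are assumptions for illustration, not taken from the paper):

```python
import statistics

def window_features(gaze_xy, rate_hz=60, window_s=12):
    """Split a gaze trace into fixed-length windows and compute simple
    dispersion features per window (hypothetical feature set)."""
    size = rate_hz * window_s  # samples per 12 s window
    feats = []
    for start in range(0, len(gaze_xy) - size + 1, size):
        xs = [p[0] for p in gaze_xy[start:start + size]]
        ys = [p[1] for p in gaze_xy[start:start + size]]
        feats.append({
            "x_sd": statistics.pstdev(xs),     # horizontal dispersion
            "y_sd": statistics.pstdev(ys),     # vertical dispersion
            "x_range": max(xs) - min(xs),      # horizontal extent
        })
    return feats
```

Each feature dictionary would then be fed to a classifier such as a Random Forest; the windowing itself is the part the 12 s figure refers to.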
Impact of Gaze Uncertainty on AOIs in Information Visualisations
Pub Date: 2022-06-08 | DOI: 10.1145/3517031.3531166
Yao Wang, Maurice Koch, Mihai Bâce, D. Weiskopf, A. Bulling
Abstract: Gaze-based analysis of areas of interest (AOIs) is widely used in information visualisation research to understand how people explore visualisations or to assess visualisation quality with respect to key characteristics such as memorability. However, nearby AOIs in visualisations amplify the uncertainty caused by gaze estimation error, which strongly influences the mapping between gaze samples or fixations and different AOIs. We contribute a novel investigation into gaze uncertainty and quantify its impact on AOI-based analysis of visualisations using two novel metrics: the Flipping Candidate Rate (FCR) and the Hit Any AOI Rate (HAAR). Our analysis of 40 real-world visualisations, including human gaze and AOI annotations, shows that gaze uncertainty frequently and significantly impacts the analysis conducted in AOI-based studies. Moreover, across four visualisation types, we found that bar and scatter plots are usually designed in a way that causes more uncertainty than line and pie plots in gaze-based analysis.
Citations: 3
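The FCR metric counts gaze points whose AOI assignment could change under gaze estimation error. The following is a loose pure-Python illustration of that idea; the rectangular AOI model and the simple axis-aligned error perturbation are assumptions, not the paper's exact definition:

```python
def flipping_candidate(fix, aois, err):
    """Return True if perturbing the fixation by up to +/- err on each
    axis changes which AOIs contain it.
    fix: (x, y); aois: list of (x0, y0, x1, y1) rectangles."""
    def contains(a, x, y):
        return a[0] <= x <= a[2] and a[1] <= y <= a[3]
    home = [a for a in aois if contains(a, *fix)]
    for dx in (-err, 0, err):
        for dy in (-err, 0, err):
            moved = [a for a in aois if contains(a, fix[0] + dx, fix[1] + dy)]
            if moved != home:
                return True
    return False

def flipping_candidate_rate(fixations, aois, err):
    """Fraction of fixations that are flipping candidates."""
    if not fixations:
        return 0.0
    return sum(flipping_candidate(f, aois, err) for f in fixations) / len(fixations)
```

Fixations deep inside an AOI never flip, while fixations near a boundary do, which is why closely packed AOIs drive the rate up.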
Tracker/Camera Calibration for Accurate Automatic Gaze Annotation of Images and Videos
Pub Date: 2022-06-01 | DOI: 10.1145/3517031.3529643
Swati Jindal, Harsimran Kaur, R. Manduchi
Abstract: Modern appearance-based gaze tracking algorithms require vast amounts of training data, with images of a viewer annotated with "ground truth" gaze direction. The standard approach to obtaining gaze annotations is to ask subjects to fixate at specific known locations, then use a head model to determine the location of the "origin of gaze". We propose using an IR gaze tracker to generate gaze annotations in natural settings that do not require the fixation of target points. This requires prior geometric calibration of the IR gaze tracker with the camera, such that the data produced by the IR tracker can be expressed in the camera's reference frame. This contribution introduces a simple tracker/camera calibration procedure based on the PnP algorithm and demonstrates its use to obtain a full characterization of gaze direction that can be used for ground truth annotation.
Citations: 0
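Once the tracker-to-camera rigid transform has been estimated (e.g. via PnP), tracker-frame data can be re-expressed in the camera's reference frame as p_cam = R p_tracker + t. A minimal sketch of that change of coordinates (the matrices below are placeholders for illustration, not a real calibration result):

```python
def to_camera_frame(point_t, R, t):
    """Map a 3D point from tracker coordinates to camera coordinates:
    p_cam = R @ p_tracker + t (pure-Python 3x3 matrix-vector multiply)."""
    return [
        sum(R[i][j] * point_t[j] for j in range(3)) + t[i]
        for i in range(3)
    ]
```

In practice R and t would come from a PnP solve against known fiducial points; this helper only applies the resulting transform.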
Multidisciplinary Reading Patterns of Digital Documents
Pub Date: 2022-05-17 | DOI: 10.1145/3517031.3531630
Bhanuka Mahanama, Gavindya Jayawardena, Yasasi Abeysinghe, V. Ashok, S. Jayarathna
Abstract: Reading plays a vital role in keeping researchers up to date on recent developments in their field, including, but not limited to, solutions to various problems and collaborative studies between disciplines. Prior studies have found that reading patterns vary with the reader's level of expertise in the document's content. We present a pilot study of eye-tracking measures during a reading task with participants across different areas of expertise, with the aim of characterizing reading patterns using both eye movement and pupillary information.
Citations: 2
Interaction Design of Dwell Selection Toward Gaze-based AR/VR Interaction
Pub Date: 2022-04-18 | DOI: 10.1145/3517031.3531628
Toshiya Isomoto, Shota Yamanaka, B. Shizuki
Abstract: In this paper, we first position dwell selection among gaze-based interaction techniques and outline its advantages over head-gaze selection, which is the mainstream interface for HMDs. Next, we show how dwell selection and head-gaze selection are used in an actual interaction situation. By comparing the two selection methods, we describe the potential of dwell selection as an essential AR/VR interaction.
Citations: 2
HPCGen: Hierarchical K-Means Clustering and Level Based Principal Components for Scan Path Genaration
Pub Date: 2022-01-19 | DOI: 10.1145/3517031.3529625
Wolfgang Fuhl
Abstract: In this paper, we present a new approach for decomposing scan paths and demonstrate its utility for generating new ones. We apply K-Means clustering to the raw gaze data and then iteratively search for further clusters within the clusters found. The clusters are grouped for each level in the hierarchy, and the most important principal components are computed from the data they contain. Using this tree hierarchy and the principal components, new scan paths can be generated that match the human behavior of the original data. We show that the generated data is very useful for producing new training data for scan path classification, but can also be used to generate fake scan paths. Code can be downloaded here: https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=%2FHPCGen&mode=list
Citations: 4
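The hierarchical clustering step described above can be sketched as a recursive k = 2 split. This is a simplified pure-Python illustration of the idea (Lloyd's algorithm with random initialisation), not the author's implementation, and it omits the per-level PCA step:

```python
import random

def two_means(points, iters=20):
    """One level of k=2 clustering on 2D points (Lloyd's algorithm)."""
    c = random.sample(points, 2)  # two distinct points as initial centroids
    groups = ([], [])
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            d = [(p[0] - ci[0]) ** 2 + (p[1] - ci[1]) ** 2 for ci in c]
            groups[d.index(min(d))].append(p)
        # Recompute centroids; keep the old one if a group is empty.
        c = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else c[i]
            for i, g in enumerate(groups)
        ]
    return groups

def hierarchy(points, depth):
    """Recursively split clusters into sub-clusters, forming a tree."""
    if depth == 0 or len(points) < 2:
        return points
    left, right = two_means(points)
    return [hierarchy(left, depth - 1), hierarchy(right, depth - 1)]
```

In the paper's scheme, principal components would then be computed from the points in each node of this tree and used to synthesise new scan paths.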
An Assessment of the Eye Tracking Signal Quality Captured in the HoloLens 2
Pub Date: 2021-11-14 | DOI: 10.1145/3517031.3529626
Samantha Aziz, Oleg V. Komogortsev
Abstract: We present an analysis of the eye tracking signal quality of the HoloLens 2's integrated eye tracker. Signal quality was measured from eye movement data captured during a random saccades task, drawn from a new eye movement dataset collected from 30 healthy adults. We characterize the eye tracking signal quality of the device in terms of spatial accuracy, spatial precision, temporal precision, linearity, and crosstalk. Most notably, our evaluation of spatial accuracy reveals that the eye movement data in our dataset appears to be uncalibrated. Recalibrating the data using a subset of the task data produces notably better eye tracking signal quality.
Citations: 14
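Spatial accuracy and precision have fairly standard definitions in eye-tracking work: accuracy is the mean angular offset of gaze from the target, and precision is often reported as the RMS of sample-to-sample differences. A simplified 1D sketch of those two computations (not the paper's exact pipeline):

```python
import math

def spatial_accuracy(gaze_deg, target_deg):
    """Mean absolute angular offset between gaze samples and the target,
    in degrees of visual angle."""
    return sum(abs(g - target_deg) for g in gaze_deg) / len(gaze_deg)

def spatial_precision_rms(gaze_deg):
    """RMS of successive sample-to-sample angular differences, in degrees.
    Low values mean the signal is stable during fixation."""
    diffs = [b - a for a, b in zip(gaze_deg, gaze_deg[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Real evaluations compute these per fixation on 2D angular offsets; the 1D form here just shows the structure of the two metrics.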