{"title":"Using Eye Tracking Data for Enhancing Adaptive Learning Systems","authors":"Kathrin Kennel","doi":"10.1145/3517031.3532195","DOIUrl":"https://doi.org/10.1145/3517031.3532195","url":null,"abstract":"Adaptive learning systems analyse a learner's input and respond on the basis of it, for example by providing individual feedback or selecting appropriate follow-up tasks. To provide good feedback, such a system must have a high diagnostic capability. The collection of gaze data alongside the traditional data obtained through mouse and keyboard input seems to be a promising approach for this. We use the example of graphical differentiation to investigate whether and how the integration of eye tracking data into such a system can succeed. For this purpose, we analyse students' eye tracking data and gather empirical understanding about which measures are suitable as decision support for adaptation","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129518184","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Poster: A Preliminary Investigation on Eye Gaze-based Concentration Recognition during Silent Reading of Text","authors":"Saki Tanaka, Airi Tsuji, K. Fujinami","doi":"10.1145/3517031.3531632","DOIUrl":"https://doi.org/10.1145/3517031.3531632","url":null,"abstract":"We propose machine learning models to recognize state of non-concentration using eye-gaze data to increase the productivity. The experimental results show that Random Forest classifier with a 12 s window can divide the states with an F1-score more than 0.9.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114413888","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Impact of Gaze Uncertainty on AOIs in Information Visualisations","authors":"Yao Wang, Maurice Koch, Mihai Bâce, D. Weiskopf, A. Bulling","doi":"10.1145/3517031.3531166","DOIUrl":"https://doi.org/10.1145/3517031.3531166","url":null,"abstract":"Gaze-based analysis of areas of interest (AOIs) is widely used in information visualisation research to understand how people explore visualisations or assess the quality of visualisations concerning key characteristics such as memorability. However, nearby AOIs in visualisations amplify the uncertainty caused by the gaze estimation error, which strongly influences the mapping between gaze samples or fixations and different AOIs. We contribute a novel investigation into gaze uncertainty and quantify its impact on AOI-based analysis on visualisations using two novel metrics: the Flipping Candidate Rate (FCR) and Hit Any AOI Rate (HAAR). Our analysis of 40 real-world visualisations, including human gaze and AOI annotations, shows that gaze uncertainty frequently and significantly impacts the analysis conducted in AOI-based studies. Moreover, we analysed four visualisation types and found that bar and scatter plots are usually designed in a way that causes more uncertainty than line and pie plots in gaze-based analysis.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"47 65","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120942153","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tracker/Camera Calibration for Accurate Automatic Gaze Annotation of Images and Videos","authors":"Swati Jindal, Harsimran Kaur, R. Manduchi","doi":"10.1145/3517031.3529643","DOIUrl":"https://doi.org/10.1145/3517031.3529643","url":null,"abstract":"Modern appearance-based gaze tracking algorithms require vast amounts of training data, with images of a viewer annotated with “ground truth” gaze direction. The standard approach to obtain gaze annotations is to ask subjects to fixate at specific known locations, then use a head model to determine the location of “origin of gaze”. We propose using an IR gaze tracker to generate gaze annotations in natural settings that do not require the fixation of target points. This requires prior geometric calibration of the IR gaze tracker with the camera, such that the data produced by the IR tracker can be expressed in the camera’s reference frame. This contribution introduces a simple tracker/camera calibration procedure based on the PnP algorithm and demonstrates its use to obtain a full characterization of gaze direction that can be used for ground truth annotation.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129797526","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multidisciplinary Reading Patterns of Digital Documents","authors":"Bhanuka Mahanama, Gavindya Jayawardena, Yasasi Abeysinghe, V. Ashok, S. Jayarathna","doi":"10.1145/3517031.3531630","DOIUrl":"https://doi.org/10.1145/3517031.3531630","url":null,"abstract":"Reading plays a vital role in updating the researchers on recent developments in the field, including but not limited to solutions to various problems and collaborative studies between disciplines. Prior studies identify reading patterns to vary depending on the level of expertise of the researcher on the content of the document. We present a pilot study of eye-tracking measures during a reading task with participants across different areas of expertise with the intention of characterizing the reading patterns using both eye movement and pupillary information.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"229 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122791534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interaction Design of Dwell Selection Toward Gaze-based AR/VR Interaction","authors":"Toshiya Isomoto, Shota Yamanaka, B. Shizuki","doi":"10.1145/3517031.3531628","DOIUrl":"https://doi.org/10.1145/3517031.3531628","url":null,"abstract":"In this paper, we first position the current dwell selection among gaze-based interactions and its advantages against head-gaze selection, which is the mainstream interface for HMDs. Next, we show how dwell selection and head-gaze selection are used in an actual interaction situation. By comparing these two selection methods, we describe the potential of dwell selection as an essential AR/VR interaction.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"104 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134458725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HPCGen: Hierarchical K-Means Clustering and Level Based Principal Components for Scan Path Genaration","authors":"Wolfgang Fuhl","doi":"10.1145/3517031.3529625","DOIUrl":"https://doi.org/10.1145/3517031.3529625","url":null,"abstract":"In this paper, we present a new approach for decomposing scan paths and its utility for generating new scan paths. For this purpose, we use the K-Means clustering procedure to the raw gaze data and subsequently iteratively to find more clusters in the found clusters. The found clusters are grouped for each level in the hierarchy, and the most important principal components are computed from the data contained in them. Using this tree hierarchy and the principal components, new scan paths can be generated that match the human behavior of the original data. We show that this generated data is very useful for generating new data for scan path classification but can also be used to generate fake scan paths. Code can be downloaded here https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=%2FHPCGen&mode=list.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"548 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-01-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123126816","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Assessment of the Eye Tracking Signal Quality Captured in the HoloLens 2","authors":"Samantha Aziz, Oleg V. Komogortsev","doi":"10.1145/3517031.3529626","DOIUrl":"https://doi.org/10.1145/3517031.3529626","url":null,"abstract":"We present an analysis of the eye tracking signal quality of the HoloLens 2’s integrated eye tracker. Signal quality was measured from eye movement data captured during a random saccades task from a new eye movement dataset collected on 30 healthy adults. We characterize the eye tracking signal quality of the device in terms of spatial accuracy, spatial precision, temporal precision, linearity, and crosstalk. Most notably, our evaluation of spatial accuracy reveals that the eye movement data in our dataset appears to be uncalibrated. Recalibrating the data using a subset of our dataset task produces notably better eye tracking signal quality.","PeriodicalId":339393,"journal":{"name":"2022 Symposium on Eye Tracking Research and Applications","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121115751","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}