{"title":"Modelling of Blink-Related Eyelid-Induced Shunting on the Electrooculogram","authors":"Nathaniel Barbara, T. Camilleri, K. Camilleri","doi":"10.1145/3448018.3457994","DOIUrl":"https://doi.org/10.1145/3448018.3457994","url":null,"abstract":"Besides the traditional regression model-based techniques to estimate the gaze angles (GAs) from electrooculography (EOG) signals, more recent works have investigated the use of a battery model for GA estimation. This is a white-box, explicit and physically-driven model which relates the monopolar EOG potential to the electrode-cornea and electrode-retina distances. In this work, this model is augmented to cater for the blink-induced EOG signal characteristics, by modelling the eyelid-induced shunting effect during blinks. Specifically, a channel-dependent parameter representing the extent to which the amount of eyelid opening affects the particular EOG channel is introduced. A method to estimate these parameters is also proposed and the proposed model is validated by incorporating it in a Kalman filter to estimate the eyelid opening during blinks. The results obtained have demonstrated that the proposed model can accurately represent the blink-related eyelid-induced shunting.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127477125","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GazeHelp: Exploring Practical Gaze-assisted Interactions for Graphic Design Tools","authors":"Ryan Lewien","doi":"10.1145/3450341.3458764","DOIUrl":"https://doi.org/10.1145/3450341.3458764","url":null,"abstract":"This system development project introduces the Adobe Photoshop plugin GazeHelp, exploring the practical application of multimodal gaze-assisted interaction in assisting current graphic design activities. It implements three core features, including QuickTool: a gaze-triggered popup that allows the user to select their next tool with gaze; X-Ray: creating a small non-destructive window at the gaze point, cutting through an artboard’s layers to expose an element on a selected underlying layer; and Privacy Shield: dimming and blocking the current art board from view when looking away from the display. Each harness the speed, gaze-contingent observational nature and presence-implying strengths of gaze respectively, and are customisable to the user’s preferences. The accompanying GazeHelpServer, complete with intuitive GUI, can also be flexibly used by other programs and plugins for further development.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122218924","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward Eye-Tracked Sideline Concussion Assessment in eXtended Reality","authors":"Anderson Schrader, Isabella Gebhart, Drew Garrison, A. Duchowski, Martian Lapadatescu, Weiyu Feng, Mahmoud Thabit, Fang Wang, Krzysztof Krejtz, Daniel D. Petty","doi":"10.1145/3448017.3457378","DOIUrl":"https://doi.org/10.1145/3448017.3457378","url":null,"abstract":"As there is no currently available portable, visuomotor assessment of concussion at the sidelines, we present preliminary development of an approach based on Predictive Visual Tracking (PVT) suitable for the sidelines. Previous work has shown PVT sensitivity and specificity of 0.85 and 0.73, respectively, for standard deviation of radial error for normal and acute concussion (mild Traumatic Brain Injury, or mTBI), using a simple orbiting target stimulus. We propose new variants of the radial and tangential error metrics and conduct preliminary evaluation in Virtual Reality when applied to two different target motions (orbit and pendulum). Our new local visualization is intuitive, especially when considering evaluation of the pendulum target. Initial results indicate promise for baseline-related, personalized concussion testing in extended reality.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129739540","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Solving Parallax Error for 3D Eye Tracking","authors":"A. Gibaldi, Vasha Dutell, M. Banks","doi":"10.1145/3450341.3458494","DOIUrl":"https://doi.org/10.1145/3450341.3458494","url":null,"abstract":"Head-mounted eye-trackers allow for unrestricted behavior in the natural environment, but have calibration issues that compromise accuracy and usability. A well-known problem arises from the fact that gaze measurements suffer from parallax error due to the offset between the scene camera origin and eye position. To compensate for this error two pieces of data are required: the pose of the scene camera in head coordinates, and the three-dimensional coordinates of the fixation point in head coordinates. We implemented a method that allows for effective and accurate eye-tracking in the three-dimensional environment. Our approach consists of a calibration procedure that allows to contextually calibrate the eye-tracker and compute the eyes pose in the reference frame of the scene camera, and a custom stereoscopic scene camera that provides the three-dimensional coordinates of the fixation point. The resulting gaze data are free from parallax error, allowing accurate and effective use of the eye-tracker in the natural environment.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"75 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114091298","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ergonomic Design Development of the Visual Experience Database Headset","authors":"Bharath Shankar, Christian Sinnott, Kamran Binaee, M. Lescroart, P. MacNeilage","doi":"10.1145/3450341.3458487","DOIUrl":"https://doi.org/10.1145/3450341.3458487","url":null,"abstract":"Head-mounted devices allow recording of eye movements, head movements, and scene video outside of the traditional laboratory setting. A key challenge for recording comprehensive first-person stimuli and behavior outside the lab is the form factor of the head-mounted assembly. It should be mounted stably on the head to minimize slippage and maximize accuracy of the data; it should be as unobtrusive and comfortable as possible to allow for natural behaviors and enable longer duration recordings; and it should be able to fit a diverse user population. Here, we survey preliminary design iterations of the Visual Experience Database headset, an assembly consisting of the Pupil Core eye tracker, the Intel RealSense T265 ™ (T265) tracking camera, and the FLIR Chameleon™3 (FLIR) world camera. Strengths and weaknesses of each iteration are explored and documented with the goal of informing future ergonomic design efforts for similar head-mounted systems.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114560725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Understanding Urban Devotion through the Eyes of an Observer","authors":"Margarita Vinnikov, Kiana Motahari, Louis I. Hamilton, B. Altin","doi":"10.1145/3448018.3458003","DOIUrl":"https://doi.org/10.1145/3448018.3458003","url":null,"abstract":"If and how an individual’s social, economic, and cultural backgrounds affect their perception of the built environment, is a fundamental problem for architects, anthropologists, historians, and urban planners alike. Similar factors affect an individual’s religious beliefs and tendencies. Our research addresses the intersection of personal background and perception of sacred space by examining people’s responses to a virtual replica of a “madonella,” a street shrine in Rome. The shrine was virtually recreated using photogrammetry. It was optimized for user studies employing VIVE Pro Eye. The study looked at the gaze behavior of 24 participants and compared their gaze patterns with demographic background and social-communal responses. The study finds that certain religious habits of an individual could predict their fixational features, including the number and total duration of fixations, on pivotal areas of interest in the shrine environment (even though these areas were placed outside of immediate sight). 
These results are a promising start to our ongoing study of the perception and received meaning of sacred space.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114628533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Power of Linked Eye Movement Data Visualizations","authors":"Michael Burch, Günter Wallner, Nick Broeks, Lulof Piree, N. Boonstra, Paul Vlaswinkel, Silke Franken, Vince van Wijk","doi":"10.1145/3448017.3457377","DOIUrl":"https://doi.org/10.1145/3448017.3457377","url":null,"abstract":"In this paper we showcase several eye movement data visualizations and how they can be interactively linked to design a flexible visualization tool for eye movement data. The aim of this project is to create a user-friendly and easy accessible tool to interpret visual attention patterns and to facilitate data analysis for eye movement data. Hence, to increase accessibility and usability we provide a web-based solution. Users can upload their own eye movement data set and inspect it from several perspectives simultaneously. Insights can be shared and collaboratively be discussed with others. The currently available visualization techniques are a 2D density plot, a scanpath representation, a bee swarm, and a scarf plot, all supporting several standard interaction techniques. Moreover, due to the linking feature, users can select data in one visualization, and the same data points will be highlighted in all active visualizations for solving comparison tasks. The tool also provides functions that make it possible to upload both, private or public data sets, and can generate URLs to share the data and settings of customized visualizations. 
A user study showed that the tool is understandable and that providing linked customizable views is beneficial for analyzing eye movement data.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126240093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Synchronization of Spontaneous Eyeblink during Formula Car Driving","authors":"Ryota Nishizono, Naoki Saijo, M. Kashino","doi":"10.1145/3448018.3458002","DOIUrl":"https://doi.org/10.1145/3448018.3458002","url":null,"abstract":"Formula car racing is a highly competitive sport. Previous studies have investigated the physiological characteristics and motor behaviors of drivers; however, little is known about how they modulate their cognitive states to improve their skills. Spontaneous eyeblink is a noteworthy factor because it reflects attentional states and is important for drivers to minimize the chance of losing critical visual information. In this study, we investigated whether the blink rate, blink synchronization among laps in each driver, and synchronization across drivers were related to their performance. Toward this end, we recorded the blinks and car behavior data of two professional drivers in quasi-racing environments. The results showed higher synchronization in higher-performance laps of each driver and across drivers but no significant change in blink rate. These results suggest that blink synchronization could reflect the changes in performance mode during formula car driving.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131908112","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visualizing Prediction Correctness of Eye Tracking Classifiers","authors":"Martin H. U. Prinzler, Christoph Schröder, Sahar Mahdie Klim Al Zaidawi, G. Zachmann, S. Maneth","doi":"10.1145/3448018.3457997","DOIUrl":"https://doi.org/10.1145/3448018.3457997","url":null,"abstract":"Eye tracking data is often used to train machine learning algorithms for classification tasks. The main indicator of performance for such classifiers is typically their prediction accuracy. However, this number does not reveal any information about the specific intrinsic workings of the classifier. In this paper we introduce novel visualization methods which are able to provide such information. We introduce the Prediction Correctness Value (PCV). It is the difference between the calculated probability for the correct class and the maximum calculated probability for any other class. Based on the PCV we present two visualizations: (1) coloring segments of eye tracking trajectories according to their PCV, thus indicating how beneficial certain parts are towards correct classification, and (2) overlaying similar information for all participants to produce a heatmap that indicates at which places fixations are particularly beneficial towards correct classification. Using these new visualizations we compare the performance of two classifiers (RF and RBFN).","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115431054","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EyeTell: Tablet-based Calibration-free Eye-typing using Smooth-pursuit movements","authors":"Tanya Bafna, Per Baekgaard, J. P. Hansen","doi":"10.1145/3448018.3458015","DOIUrl":"https://doi.org/10.1145/3448018.3458015","url":null,"abstract":"Gaze tracking technology, with the increasingly robust and lightweight equipment, can have tremendous applications. To use the technology during short interactions, such as in public displays or hospitals to communicate non-verbally after a surgery, the application needs to be intuitive without requiring a calibration. Gaze gestures such as smooth-pursuit eye movements can be detected without calibration. We report the working performance of a calibration-free eye-typing application using only the front-facing camera of a tablet. In a user study with 29 participants, we obtained an average typing speed of 1.27 WPM after four trials and a maximum typing speed of 1.95 WPM.","PeriodicalId":226088,"journal":{"name":"ACM Symposium on Eye Tracking Research and Applications","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123219536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}