{"title":"Recurrence quantification analysis reveals eye-movement behavior differences between experts and novices","authors":"Preethi Vaidyanathan, J. Pelz, Cecilia Ovesdotter Alm, P. Shi, Anne R. Haake","doi":"10.1145/2578153.2578207","DOIUrl":"https://doi.org/10.1145/2578153.2578207","url":null,"abstract":"Understanding and characterizing perceptual expertise is a major bottleneck in developing intelligent systems. In knowledge-rich domains such as dermatology, perceptual expertise influences the diagnostic inferences made based on the visual input. This study uses eye movement data from 12 dermatology experts and 12 undergraduate novices while they inspected 34 dermatological images. This work investigates the differences in global and local temporal fixation patterns between the two groups using recurrence quantification analysis (RQA). The RQA measures reveal significant differences in both global and local temporal patterns between the two groups. Results show that experts tended to refixate previously inspected areas less often than did novices, and their refixations were more widely separated in time. Experts were also less likely to follow extended scan paths repeatedly than were novices. These results suggest the potential value of RQA measures in characterizing perceptual expertise. We also discuss potential use of the RQA method in understanding the interactions between experts' visual and linguistic behavior.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116320211","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Eye-movement sequence statistics and hypothesis-testing with classical recurrence analysis","authors":"T. P. Keane, N. Cahill, J. Pelz","doi":"10.1145/2578153.2578174","DOIUrl":"https://doi.org/10.1145/2578153.2578174","url":null,"abstract":"Dynamical systems analysis tools, like Recurrence Plotting (RP), allow for concise mathematical representations of complex systems with relatively simple descriptive metrics. These methods are invariant for phase-space trajectories of a time series from a dynamical system, allowing analyses on simplified data sets which preserve the system model's dynamics. In the past decade, recurrence methods have been applied to eye-tracking, but those analyses avoided Time-Delay Embedding (TDE). Without TDE, we lose the assumption that phase-space trajectories are being preserved in the recurrence plot. Thus, analysis has been typically limited to clustering fixation locations in the image space, instead of clustering data sequences in the phase space. We will show how classical recurrence analysis methods can be extended to allow for multi-modal data visualization and quantification, by presenting an open-source python implementation for analyzing eye movements.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123573707","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of an untethered, mobile, low-cost head-mounted eye tracker","authors":"Elizabeth S. Kim, A. Naples, G. V. Gearty, Quan Wang, Seth Wallace, Carla A. Wall, Michael Perlmutter, F. Volkmar, F. Shic, L. Friedlaender, J. Kowitt, B. Reichow","doi":"10.1145/2578153.2578209","DOIUrl":"https://doi.org/10.1145/2578153.2578209","url":null,"abstract":"Head-mounted eye-tracking systems allow us to observe participants' gaze behaviors in largely unconstrained, real-world settings. We have developed novel, untethered, mobile, low-cost, lightweight, easily-assembled head-mounted eye-tracking devices, comprised entirely of off-the-shelf components, including untethered, point-of-view, sports cameras. In total, the parts we have used cost ~$153, and we suggest untested alternative components that reduce the cost of parts to ~$31. Our device can be easily assembled using hobbying skills and techniques. We have developed hardware, software, and methodological techniques to perform point-of-regard estimation, and to temporally align scene and eye videos in the face of variable frame rate, which plagues low-cost, lightweight, untethered cameras. We describe an innovative technique for synchronizing eye and scene videos using synchronized flashing lights. Our hardware, software, and calibration designs will be made publicly available, and we describe them in detail here, to facilitate replication of our system. We also describe novel smooth-pursuit-based calibration methodology, which affords rich sampling of calibration data while compensating for lack of information regarding the extent of visibility on participants' scene recordings. Validation experiments indicate accuracy within 0.752 degrees of visual angle on average.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"96 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128923135","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Detection of vigilance performance with pupillometry","authors":"L. McIntire, J. McIntire, R. McKinley, C. Goodyear","doi":"10.1145/2578153.2578177","DOIUrl":"https://doi.org/10.1145/2578153.2578177","url":null,"abstract":"Sustained attention (vigilance) is required for many professions such as air traffic controllers, imagery analysts, airport security screeners, and cyber operators. A lapse in attention in any of these environments can have deadly consequences. The purpose of this study was to determine the ability of pupillometry to detect changes in vigilance performance. Each participant performed a 40-minute vigilance task while wearing an eye-tracker on each of four separate days. Pupil diameter, pupil eccentricity, and pupil velocity all changed significantly over time (p<.05) during the task. Significant correlations indicate that all metrics increased as vigilance performance declined except for pupil diameter, which decreased and the pupil became miotic. These results are consistent with other research on attention, fatigue, and arousal levels. Using an eye-tracker to detect changes in pupillometry in an operational environment would allow interventions to be implemented.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129049458","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Characterizing visual attention during driving and non-driving hazard perception tasks in a simulated environment","authors":"A. Mackenzie, Julie M. Harris","doi":"10.1145/2578153.2578171","DOIUrl":"https://doi.org/10.1145/2578153.2578171","url":null,"abstract":"Research into driving skill, particularly of hazard perception, often involves studies where participants either view pictures of driving scenarios or use movie viewing paradigms. However oculomotor strategies tend to change between active and passive tasks and attentional limitations are introduced during real driving. Here we present a study using eye tracking methods, to contrast oculomotor behaviour differences across a passive video based hazard perception task and an active hazard perception simulated driving task. The differences presented highlight a requirement to study driving skill under more active conditions, where the participant is engaged with a driving task. Our results suggest that more standard, passive tests, may have limited utility when developing visual models of driving behaviour. The results presented here have implications for driver safety measures and provide further insights into how vision and action interact during natural activity.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"126 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130983162","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Rendering synthetic ground truth images for eye tracker evaluation","authors":"Lech Swirski, N. Dodgson","doi":"10.1145/2578153.2578188","DOIUrl":"https://doi.org/10.1145/2578153.2578188","url":null,"abstract":"When evaluating eye tracking algorithms, a recurring issue is what metric to use and what data to compare against. User studies are informative when considering the entire eye tracking system, however they are often unsatisfactory for evaluating the gaze estimation algorithm in isolation. This is particularly an issue when evaluating a system's component parts, such as pupil detection, pupil-to-gaze mapping or head pose estimation. Instead of user studies, eye tracking algorithms can be evaluated using simulated input video. We describe a computer graphics approach to creating realistic synthetic eye images, using a 3D model of the eye and head and a physically correct rendering technique. By using rendering, we have full control over the parameters of the scene such as the gaze vector or camera position, which allows the calculation of ground truth data, while creating a realistic input for a video-based gaze estimator.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126899417","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Heatmap rendering from large-scale distributed datasets using cloud computing","authors":"Thanh-Chung Dao, R. Bednarik, Hana Vrzakova","doi":"10.1145/2578153.2578187","DOIUrl":"https://doi.org/10.1145/2578153.2578187","url":null,"abstract":"Heatmap is one of the most popular visualizations of gaze behavior, however, increasingly voluminous streams of eye-tracking data make processing of such visualization computationally demanding. Because of high requirements on a single processing machine, real-time visualizations from multiple users are unfeasible if rendered locally. We designed a framework that collects data from multiple eye-trackers regardless of their physical location, analyses these streams, and renders heatmaps in real-time. We propose a cloud computing architecture (EyeCloud) consisting of master and slave nodes on a cloud cluster, and a web interface for fast computation and effective aggregation of the large volumes of eye-tracking data. In experimental studies of the feasibility and effectiveness, we built a cloud cluster on a well-known service, implemented the architecture and reported on a comparison between the proposed system and traditional local processing. The results showed efficiency of the EyeCloud when recordings vary in durations. To our knowledge, this is the first solution to implement cloud computing for gaze visualization.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"169 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116696123","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The applicability of probabilistic methods to the online recognition of fixations and saccades in dynamic scenes","authors":"Enkelejda Kasneci, G. Kasneci, Thomas C. Kübler, W. Rosenstiel","doi":"10.1145/2578153.2578213","DOIUrl":"https://doi.org/10.1145/2578153.2578213","url":null,"abstract":"In many applications involving scanpath analysis, especially when dynamic scenes are viewed, consecutive fixations and saccades, have to be identified and extracted from raw eye-tracking data in an online fashion. Since probabilistic methods can adapt not only to the individual viewing behavior, but also to changes in the scene, they are best suited for such tasks. In this paper we analyze the applicability of two types of main-stream probabilistic models to the identification of fixations and saccades in dynamic scenes: (1) Hidden Markov Models and (2) Bayesian Online Mixture Models. We analyze and compare the classification performance of the models on eye-tracking data collected during real-world driving experiments.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114278707","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"TraQuMe: a tool for measuring the gaze tracking quality","authors":"Deepak Akkil, Poika Isokoski, J. Kangas, Jussi Rantala, R. Raisamo","doi":"10.1145/2578153.2578192","DOIUrl":"https://doi.org/10.1145/2578153.2578192","url":null,"abstract":"Consistent measuring and reporting of gaze data quality is important in research that involves eye trackers. We have developed TraQuMe: a generic system to evaluate the gaze data quality. The quality measurement is fast and the interpretation of the results is aided by graphical output. Numeric data is saved for reporting of aggregate metrics for the whole experiment. We tested TraQuMe in the context of a novel hidden calibration procedure that we developed to aid in experiments where participants should not know that their gaze is being tracked. The quality of tracking data after the hidden calibration procedure was very close to that obtained with the Tobii's T60 trackers built-in 2 point, 5 point and 9 point calibrations.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134267923","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Realistic heatmap visualization for interactive analysis of 3D gaze data","authors":"M. Maurus, J. H. Hammer, J. Beyerer","doi":"10.1145/2578153.2578204","DOIUrl":"https://doi.org/10.1145/2578153.2578204","url":null,"abstract":"In this paper, a novel approach for real-time heatmap generation and visualization of 3D gaze data is presented. By projecting the gaze into the scene and considering occlusions from the observer's view, to our knowledge, for the first time a correct visualization of the actual scene perception in 3D environments is provided. Based on a graphics-centric approach utilizing the graphics pipeline, shaders and several optimization techniques, heatmap rendering is fast enough for an interactive online and offline gaze analysis of thousands of gaze samples.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134633998","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}