M. Bernhard, Camillo Dell'mour, Michael Hecher, E. Stavrakis, M. Wimmer. "The effects of fast disparity adjustment in gaze-controlled stereoscopic applications." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578169
Abstract: With the emergence of affordable 3D displays, stereoscopy is becoming a commodity. However, users often report discomfort even after brief exposure to stereo content. One of the main reasons is the conflict between vergence and accommodation caused by 3D displays. We investigate dynamic adjustment of stereo parameters in a scene using gaze data in order to reduce discomfort. In a user study, we measured stereo fusion times after abrupt manipulation of disparities using gaze data. We found that gaze-controlled manipulation of disparities can lower fusion times for large disparities. In addition, we found that gaze-controlled disparity adjustment should be applied in a personalized manner and ideally performed only at the extremities of, or outside, the subjects' comfort zone. These results provide important insight into the problems associated with fast disparity manipulation and are essential for developing appealing gaze-contingent and gaze-controlled applications.
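The study manipulates on-screen disparity abruptly at the moment of a gaze change. The sketch below is a minimal illustration of such a gaze-contingent adjustment, not the authors' implementation; the viewer geometry, the comfort-zone threshold, and the choice to adjust only outside that zone are assumptions.

```python
# Minimal sketch of gaze-contingent disparity adjustment (not the authors' code).
# The fixated point is pulled back to the screen plane, but only when its
# disparity falls outside an assumed per-user comfort zone.

def screen_disparity(depth_m, eye_sep_m=0.065, screen_dist_m=0.6):
    """Approximate on-screen parallax (metres) of a point at depth_m:
    zero at the screen plane, negative (crossed) in front of it."""
    return eye_sep_m * (depth_m - screen_dist_m) / depth_m

def disparity_correction(gaze_depth_m, comfort_mm=10.0,
                         eye_sep_m=0.065, screen_dist_m=0.6):
    """Additive disparity shift to apply to the scene after a gaze change."""
    d = screen_disparity(gaze_depth_m, eye_sep_m, screen_dist_m)
    if abs(d) * 1000.0 <= comfort_mm:
        return 0.0   # inside the comfort zone: leave stereo parameters alone
    return -d        # abrupt shift so the fixated point lands near zero disparity
```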
Daniel J. Campbell, Joseph T. Chang, K. Chawarska, F. Shic. "Saliency-based Bayesian modeling of dynamic viewing of static scenes." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578159
Abstract: Most analytic approaches for eye-tracking data focus either on identifying fixations and saccades or on estimating saliency properties. Analyzing both aspects of visual attention simultaneously provides a more comprehensive view of the strategies used to process information. This work presents a method that incorporates both aspects in a unified Bayesian model to jointly estimate dynamic properties of scanpaths and a saliency map. Performance of the model is assessed on simulated data and on eye-tracking data from 15 children with autism spectrum disorder (ASD) and 13 typically developing (TD) control children. Saliency differences between the ASD and TD groups were found for both social and non-social images, whereas differences in dynamic gaze features were evident in only a subset of social images. These results are consistent with previous region-based analyses as well as previous fixation parameter models, suggesting that the new approach may provide synthesizing and statistical perspectives on eye-tracking analyses.
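The paper's model infers a saliency map and scanpath dynamics jointly. As a much simpler point of reference, the sketch below estimates only an empirical saliency map from fixation locations using a Gaussian kernel; the grid size and bandwidth are illustrative assumptions and this is not the paper's Bayesian estimator.

```python
# Simplified sketch: empirical saliency map from fixation locations.
# The paper's actual approach estimates saliency and scanpath dynamics jointly.
import numpy as np

def empirical_saliency(fixations_xy, width, height, bandwidth=30.0):
    """Return a (height, width) saliency map from fixation coordinates in pixels."""
    ys, xs = np.mgrid[0:height, 0:width]
    sal = np.zeros((height, width), dtype=float)
    for fx, fy in fixations_xy:
        sal += np.exp(-((xs - fx) ** 2 + (ys - fy) ** 2) / (2.0 * bandwidth ** 2))
    sal /= sal.sum()   # normalize to a probability distribution over pixels
    return sal
```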
Andrea Mazzei, Shahram Eivazi, Youri Marko, F. Kaplan, P. Dillenbourg. "3D model-based gaze estimation in natural reading: a systematic error correction procedure based on annotated texts." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578164
Abstract: Studying natural reading and its underlying attention processes requires devices that are able to provide precise measurements of gaze without rendering the reading activity unnatural. In this paper we propose an eye tracking system that can be used to conduct analyses of reading behavior in low-constrained experimental settings. The system is designed for dual-camera-based head-mounted eye trackers and allows free head movements and note taking. The system is composed of three modules. First, a 3D model-based gaze estimation method computes the reader's gaze trajectory. Second, a document image retrieval algorithm is used to recognize document pages and extract annotations. Third, a systematic error correction procedure is used to post-calibrate the system parameters and compensate for spatial drifts. The validation results show that the proposed method is capable of extracting reliable gaze data when reading in low-constrained experimental conditions.
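The third module post-calibrates the system to compensate for spatial drift. As an illustration of the general idea (not the authors' exact procedure), the sketch below fits an affine correction by least squares from recorded gaze samples to the known centres of the annotated words they correspond to, and then applies it to all samples.

```python
# Illustrative post-hoc correction: fit an affine map from gaze samples to the
# known word centres they were paired with, then apply it to compensate drift.
import numpy as np

def fit_affine_correction(gaze_xy, target_xy):
    """Fit A (2x3) such that target ~ A @ [x, y, 1]^T, by least squares."""
    G = np.hstack([np.asarray(gaze_xy, float), np.ones((len(gaze_xy), 1))])
    T = np.asarray(target_xy, float)
    sol, *_ = np.linalg.lstsq(G, T, rcond=None)   # shape (3, 2)
    return sol.T                                   # shape (2, 3)

def apply_correction(gaze_xy, A):
    """Apply the fitted affine correction to all gaze samples."""
    G = np.hstack([np.asarray(gaze_xy, float), np.ones((len(gaze_xy), 1))])
    return G @ A.T
```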
Brooke E. Wooley, David S. March. "Exploring the influence of audio in directing visual attention during dynamic content." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578180
Abstract: The mechanisms underlying the allocation of visual attention toward dynamic content are still largely unexplored. Because of the number of variables present in dynamic content, it is often difficult to determine confidently which components direct visual attention. In this study, we manipulated the presence of audio to explore its contribution to driving visual attention during dynamic content. Participants viewed a reel of non-global commercials while their eye movements were recorded. Participants were exposed either to content containing the original audio track or to content in which the audio track had been edited out. Dynamic heat maps were created for each ad to identify areas of high visual attention between the conditions, and fixation durations and fixation counts were then computed for each area of interest. Analyses showed that the presence of audio influences the allocation of visual attention during dynamic content, most notably with regard to on-screen text. Understanding the influence of audio in directing visual attention may help future researchers control for its extraneous influence in eye-tracking methodologies.
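The per-AOI measures mentioned above (fixation count and total fixation duration) reduce to a simple aggregation once fixations and areas of interest are available. The sketch below assumes fixations as (x, y, duration_ms) tuples and axis-aligned rectangular AOIs; both representations are illustrative choices, not the authors' pipeline.

```python
# Sketch of per-AOI fixation metrics: count and total duration per area of interest.
def aoi_metrics(fixations, aois):
    """fixations: iterable of (x, y, duration_ms).
    aois: dict name -> (x_min, y_min, x_max, y_max).
    Returns dict name -> (fixation_count, total_duration_ms)."""
    stats = {name: [0, 0.0] for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                stats[name][0] += 1
                stats[name][1] += dur
    return {name: tuple(v) for name, v in stats.items()}
```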
Michael Raschke, Dominik Herr, Tanja Blascheck, T. Ertl, Michael Burch, Sven Willmann, M. Schrauf. "A visual approach for scan path comparison." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578173
Abstract: Several algorithms, approaches, and implementations have been developed to support the comparison of scan paths and the discovery of interesting scan path structures. In this work we contribute a visual approach to support scan path comparison. A key feature of this approach is the combination of a clustering algorithm based on the Levenshtein distance with the parallel scan path visualization technique. Combining computational methods with an interactive visualization allows us to use both the power of pattern-finding algorithms and the human ability to visually recognize patterns. To demonstrate the concept in practice, we implemented the approach in a prototype and show its application in two scan path analysis scenarios from automobile usability testing and visualization research.
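The clustering step is driven by the Levenshtein distance between scan paths encoded as strings of AOI labels (e.g. "ABBC"). A minimal implementation of that distance is sketched below; the clustering itself and the parallel scan path visualization built on top of it are not shown.

```python
# Minimal Levenshtein (edit) distance between two AOI-label sequences.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# e.g. levenshtein("ABBC", "ABC") == 1
```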
J. Turner, A. Bulling, Jason Alexander, Hans-Werner Gellersen. "Cross-device gaze-supported point-to-point content transfer." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578155
Abstract: Within a pervasive computing environment, we see content on shared displays that we wish to acquire and use in a specific way, i.e., with an application on a personal device, transferring it from point to point. The eyes as input can indicate the intention to interact with a service, providing implicit pointing as a result. In this paper we investigate the use of gaze and manual input for positioning gaze-acquired content on personal devices. We evaluate two main techniques: (1) Gaze Positioning, where content is transferred using gaze, with manual input to confirm actions; and (2) Manual Positioning, where content is selected with gaze but final positioning is performed by manual input, involving a switch of modalities from gaze to manual input. A first user study compares these techniques applied to direct and indirect manual input configurations: a tablet with touch input and a laptop with mouse input. A second study evaluated our techniques in an application scenario involving distractor targets. Our overall results showed general acceptance and understanding of all conditions, although there were clear individual user preferences depending on familiarity with, and preference toward, gaze, touch, or mouse input.
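As a rough sketch of how the two techniques differ in their event flow, the classes below reduce a single transfer to "pick up under gaze, then drop." The class names, event handlers, and the place() helper are hypothetical, not the study software.

```python
# Hypothetical sketch of the two positioning techniques (names and events assumed).

def place(item, position):
    """Stand-in for the actual drop action on the personal device."""
    print(f"placed {item!r} at {position}")

class GazePositioning:
    """Content is selected and positioned by gaze; manual input only confirms."""
    def __init__(self):
        self.held = None
    def on_confirm(self, gaze_on_display, gaze_on_device):
        if gaze_on_display is not None:
            self.held = gaze_on_display          # pick up the item under gaze
        elif self.held is not None and gaze_on_device is not None:
            place(self.held, gaze_on_device)     # drop it at the gazed position
            self.held = None

class ManualPositioning:
    """Content is selected by gaze, but final placement uses touch or mouse input."""
    def __init__(self):
        self.held = None
    def on_confirm(self, gaze_on_display):
        if gaze_on_display is not None:
            self.held = gaze_on_display          # gaze selects the item
    def on_manual_drop(self, device_xy):
        if self.held is not None:
            place(self.held, device_xy)          # manual input positions it
            self.held = None
```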
Lien Dupont, V. Eetvelde. "The use of eye-tracking in landscape perception research." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2583036
Abstract: The European Landscape Convention defines landscape as "an area, as perceived by people, whose character is the result of the action and interaction of natural and/or human factors" [Council of Europe 2000]. This definition puts people at the core of the landscape and makes them part of it while observing the landscape. In addition, the Convention emphasizes that landscape is an important public interest that determines part of the quality of life for people everywhere. Consequently, active participation of the public in landscape planning and management is strongly encouraged [Council of Europe 2000]. In light of these statements, it would be beneficial to gain insight into people's observation and perception of landscapes in order to use this knowledge for landscape planning and management. So far, different landscape perception paradigms have been formulated [Scott and Benson 2002] and analyzed using questionnaires and in-depth interviews. The most frequently used stimuli in this empirical research are photographs or in situ observations [e.g., Ode et al. 2008; Palmer 2004; Tveit 2009]. Eye-tracking in combination with landscape photographs, however, offers an objective way to measure people's observation of landscapes.
Jia-Bin Huang, Q. Cai, Zicheng Liu, N. Ahuja, Zhengyou Zhang. "Towards accurate and robust cross-ratio based gaze trackers through learning from simulation." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2578162
Abstract: Cross-ratio (CR) based methods offer many attractive properties for remote gaze estimation using a single camera in an uncalibrated setup by exploiting the invariance of a plane projectivity. Unfortunately, due to several simplifying assumptions, the performance of CR-based eye gaze trackers decays significantly as the subject moves away from the calibration position. In this paper, we introduce an adaptive homography mapping for achieving gaze prediction with higher accuracy at the calibration position and more robustness under head movements. This is achieved with a learning-based method for compensating both spatially varying gaze errors and head-pose-dependent errors simultaneously in a unified framework. The model of adaptive homography is trained offline using simulated data, saving a tremendous amount of time in data collection. We validate the effectiveness of the proposed approach using both simulated and real data from a physical setup. We show that our method compares favorably against other state-of-the-art CR-based methods.
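In the basic (non-adaptive) CR setup, the corneal glints produced by IR lights at the screen corners are mapped to the screen corners by a homography, and the pupil centre is projected through it. The sketch below shows only that baseline mapping via a direct linear transform; the learned, adaptive correction proposed in the paper is not shown, and the use of exactly four glint-to-corner correspondences is an illustrative assumption.

```python
# Baseline homography mapping as used in cross-ratio style gaze estimation
# (the paper's adaptive, learned correction is not included here).
import numpy as np

def fit_homography(src, dst):
    """Estimate H (3x3) such that dst ~ H @ src in homogeneous coordinates (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_point(H, p):
    """Project a 2D point through the homography."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return x / w, y / w

# e.g. H maps the four glint positions to the four screen corners,
# and map_point(H, pupil_center) gives the estimated gaze point on the screen.
```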
Takahiro Yoshioka, S. Nakashima, J. Odagiri, Hideki Tomimori, Taku Fukui. "Pupil detection in the presence of specular reflection." Proceedings of the Symposium on Eye Tracking Research and Applications, 2014. DOI: 10.1145/2578153.2582175
Abstract: In this work we describe a method of pupil detection for subsequent gaze tracking, when specular reflection is present in the image. Gaze tracking commonly uses the spatial relationship between the pupil and corneal reflection, but is not robust when the user is wearing eyeglasses, since light reflected from the surroundings changes the appearance of the pupil. In this research we propose and evaluate a pupil detection method that can perform robustly even in the presence of such reflection.
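For context, the sketch below is a common baseline for dark-pupil detection that suppresses bright specular reflections by inpainting before thresholding and ellipse fitting; it is an illustration of the problem setting, not the proposed method, and the threshold values are assumptions that would need tuning per camera.

```python
# Illustrative baseline (not the proposed method): inpaint specular highlights,
# then threshold the dark pupil region and fit an ellipse to the largest blob.
import cv2
import numpy as np

def detect_pupil(gray):
    """gray: 8-bit grayscale eye image. Returns an ellipse ((cx, cy), axes, angle) or None."""
    # Mask saturated specular highlights and fill them from the surroundings.
    _, spec = cv2.threshold(gray, 230, 255, cv2.THRESH_BINARY)
    spec = cv2.dilate(spec, np.ones((5, 5), np.uint8))
    filled = cv2.inpaint(gray, spec, 5, cv2.INPAINT_TELEA)

    # The pupil is assumed to be the largest dark region.
    _, dark = cv2.threshold(filled, 60, 255, cv2.THRESH_BINARY_INV)
    dark = cv2.morphologyEx(dark, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:        # fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(largest)
```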