Online 3D Gaze Localization on Stereoscopic Displays
Rui I. Wang, Brandon Pelfrey, A. Duchowski, D. House
ACM Transactions on Applied Perception 5(1), 3:1-3:21, April 2014. DOI: 10.1145/2593689
Citations: 34
Abstract
This article summarizes our previous work on developing an online system for estimating 3D gaze depth using eye tracking in a stereoscopic environment, and reports on recent extensions that allow us to report the full 3D gaze position. Our system employs a 3D calibration process that determines the parameters of a mapping from a naive depth estimate, based simply on triangulation, to a refined 3D gaze point estimate tuned to a particular user. We show that our system improves on the geometry-based 3D gaze estimation returned by a proprietary algorithm provided with our tracker. We also compare our approach with the Parameterized Self-Organizing Map (PSOM) method, due to Essig and colleagues, which also calibrates individually to each user. We argue that our method is superior in speed and ease of calibration, is easier to implement, and does not require an iterative solver to produce a gaze position, thus guaranteeing computation at the rate of tracker acquisition. In addition, we report on a user study indicating that, compared with PSOM, our method estimates gaze depth more accurately and is nearly as accurate in estimating horizontal and vertical position. Results are verified on two different 3D eye tracking systems: a high-accuracy Wheatstone haploscope and a medium-accuracy active stereo display. Thus, ours is the recommended method for applications that primarily require gaze depth information, while its ease of use makes it suitable for many applications requiring the full 3D gaze position.
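The pipeline the abstract describes can be illustrated with a minimal sketch: a naive 3D gaze point is first triangulated as the midpoint of closest approach between the left- and right-eye gaze rays, and a per-user calibration then maps that naive estimate to a refined one. This is not the authors' implementation; the function names (`naive_gaze_point`, `fit_calibration`, `apply_calibration`) and the choice of a closed-form affine least-squares mapping are hypothetical stand-ins for the paper's actual, more refined mapping. What the sketch does preserve is the key property claimed in the abstract: once fitted, producing a gaze position needs no iterative solver, so it can run at tracker acquisition rate.

```python
import numpy as np

def naive_gaze_point(p_l, d_l, p_r, d_r):
    """Naive 3D gaze estimate by triangulation: the midpoint of the
    segment of closest approach between the two gaze rays.
    p_l, p_r: 3D eye positions; d_l, d_r: unit gaze directions."""
    w0 = p_l - p_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:              # near-parallel rays: degenerate case
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom    # parameter along the left ray
        t = (a * e - b * d) / denom    # parameter along the right ray
    return 0.5 * ((p_l + s * d_l) + (p_r + t * d_r))

def fit_calibration(naive_pts, true_pts):
    """Fit an affine map from naive estimates to known calibration
    targets by linear least squares (closed form, no iterative solver).
    naive_pts, true_pts: arrays of shape [N, 3]."""
    X = np.hstack([naive_pts, np.ones((len(naive_pts), 1))])  # [N, 4]
    A, *_ = np.linalg.lstsq(X, true_pts, rcond=None)          # [4, 3]
    return A

def apply_calibration(A, naive_pt):
    """Refine a single naive estimate with the fitted mapping."""
    return np.append(naive_pt, 1.0) @ A
```

In an actual calibration session, the true target points would be presented at known 3D positions spanning the display volume, with the naive estimates collected at each fixation; the fitted mapping is then applied to every subsequent tracker sample.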
Journal description:
ACM Transactions on Applied Perception (TAP) aims to strengthen the synergy between computer science and psychology/perception by publishing top-quality papers that help to unify research in these fields.
The journal publishes interdisciplinary research of significant and lasting value in any topic area that spans both computer science and perceptual psychology. All papers must incorporate both perceptual and computer science components.