Online 3D Gaze Localization on Stereoscopic Displays

IF 1.9 | CAS Zone 4 (Computer Science) | Q3 COMPUTER SCIENCE, SOFTWARE ENGINEERING
Rui I. Wang, Brandon Pelfrey, A. Duchowski, D. House
{"title":"基于立体显示器的在线三维凝视定位","authors":"Rui I. Wang, Brandon Pelfrey, A. Duchowski, D. House","doi":"10.1145/2593689","DOIUrl":null,"url":null,"abstract":"This article summarizes our previous work on developing an online system to allow the estimation of 3D gaze depth using eye tracking in a stereoscopic environment. We report on recent extensions allowing us to report the full 3D gaze position. Our system employs a 3D calibration process that determines the parameters of a mapping from a naive depth estimate, based simply on triangulation, to a refined 3D gaze point estimate tuned to a particular user. We show that our system is an improvement on the geometry-based 3D gaze estimation returned by a proprietary algorithm provided with our tracker. We also compare our approach with that of the Parameterized Self-Organizing Map (PSOM) method, due to Essig and colleagues, which also individually calibrates to each user. We argue that our method is superior in speed and ease of calibration, is easier to implement, and does not require an iterative solver to produce a gaze position, thus guaranteeing computation at the rate of tracker acquisition. In addition, we report on a user study that indicates that, compared with PSOM, our method more accurately estimates gaze depth, and is nearly as accurate in estimating horizontal and vertical position. Results are verified on two different 4D eye tracking systems, a high accuracy Wheatstone haploscope and a medium accuracy active stereo display. Thus, it is the recommended method for applications that primarily require gaze depth information, while its ease of use makes it suitable for many applications requiring full 3D gaze position.","PeriodicalId":50921,"journal":{"name":"ACM Transactions on Applied Perception","volume":"5 1","pages":"3:1-3:21"},"PeriodicalIF":1.9000,"publicationDate":"2014-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"34","resultStr":"{\"title\":\"Online 3D Gaze Localization on Stereoscopic Displays\",\"authors\":\"Rui I. Wang, Brandon Pelfrey, A. Duchowski, D. House\",\"doi\":\"10.1145/2593689\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This article summarizes our previous work on developing an online system to allow the estimation of 3D gaze depth using eye tracking in a stereoscopic environment. We report on recent extensions allowing us to report the full 3D gaze position. Our system employs a 3D calibration process that determines the parameters of a mapping from a naive depth estimate, based simply on triangulation, to a refined 3D gaze point estimate tuned to a particular user. We show that our system is an improvement on the geometry-based 3D gaze estimation returned by a proprietary algorithm provided with our tracker. We also compare our approach with that of the Parameterized Self-Organizing Map (PSOM) method, due to Essig and colleagues, which also individually calibrates to each user. We argue that our method is superior in speed and ease of calibration, is easier to implement, and does not require an iterative solver to produce a gaze position, thus guaranteeing computation at the rate of tracker acquisition. In addition, we report on a user study that indicates that, compared with PSOM, our method more accurately estimates gaze depth, and is nearly as accurate in estimating horizontal and vertical position. 
Results are verified on two different 4D eye tracking systems, a high accuracy Wheatstone haploscope and a medium accuracy active stereo display. Thus, it is the recommended method for applications that primarily require gaze depth information, while its ease of use makes it suitable for many applications requiring full 3D gaze position.\",\"PeriodicalId\":50921,\"journal\":{\"name\":\"ACM Transactions on Applied Perception\",\"volume\":\"5 1\",\"pages\":\"3:1-3:21\"},\"PeriodicalIF\":1.9000,\"publicationDate\":\"2014-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"34\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Applied Perception\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1145/2593689\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Applied Perception","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/2593689","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 34

Abstract

This article summarizes our previous work on developing an online system to allow the estimation of 3D gaze depth using eye tracking in a stereoscopic environment. We report on recent extensions allowing us to report the full 3D gaze position. Our system employs a 3D calibration process that determines the parameters of a mapping from a naive depth estimate, based simply on triangulation, to a refined 3D gaze point estimate tuned to a particular user. We show that our system is an improvement on the geometry-based 3D gaze estimation returned by a proprietary algorithm provided with our tracker. We also compare our approach with that of the Parameterized Self-Organizing Map (PSOM) method, due to Essig and colleagues, which also individually calibrates to each user. We argue that our method is superior in speed and ease of calibration, is easier to implement, and does not require an iterative solver to produce a gaze position, thus guaranteeing computation at the rate of tracker acquisition. In addition, we report on a user study that indicates that, compared with PSOM, our method more accurately estimates gaze depth, and is nearly as accurate in estimating horizontal and vertical position. Results are verified on two different 4D eye tracking systems, a high accuracy Wheatstone haploscope and a medium accuracy active stereo display. Thus, it is the recommended method for applications that primarily require gaze depth information, while its ease of use makes it suitable for many applications requiring full 3D gaze position.
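The abstract describes the naive estimate as a simple triangulation of the two gaze rays, followed by a per-user calibrated mapping to a refined 3D gaze point. The sketch below is a minimal illustration of that idea only: the function names, the midpoint-of-closest-approach triangulation, and the polynomial depth refinement are assumptions made here for clarity, not the authors' published algorithm, whose mapping parameters are fitted during a dedicated 3D calibration procedure.

```python
import numpy as np

def naive_gaze_point(eye_l, dir_l, eye_r, dir_r):
    """Naive 3D gaze estimate: midpoint of the shortest segment between
    the left- and right-eye gaze rays (simple triangulation)."""
    d_l = dir_l / np.linalg.norm(dir_l)   # unit gaze direction, left eye
    d_r = dir_r / np.linalg.norm(dir_r)   # unit gaze direction, right eye
    w0 = eye_l - eye_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                 # nearly parallel rays (gaze near infinity)
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    p_l = eye_l + s * d_l                 # closest point on the left ray
    p_r = eye_r + t * d_r                 # closest point on the right ray
    return 0.5 * (p_l + p_r)              # midpoint as the naive gaze point

def refine_gaze(naive_point, depth_coeffs):
    """Hypothetical per-user refinement: replace the naive depth (z) with a
    low-order polynomial of it, coefficients fitted during calibration."""
    refined = naive_point.copy()
    refined[2] = np.polyval(depth_coeffs, naive_point[2])
    return refined
```

For example, with eye positions at roughly (±0.032, 0, 0) m and gaze directions converging on a target half a metre in front of the viewer, `naive_gaze_point` returns that convergence point; calibration data collected at known 3D targets would then be used to fit `depth_coeffs` so that `refine_gaze` corrects the systematic depth error for that particular user. Because the refinement is a closed-form evaluation rather than an iterative solve, it can run at the tracker's acquisition rate, which is the property the abstract emphasizes.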
Source journal
ACM Transactions on Applied Perception (Engineering & Technology · Computer Science: Software Engineering)
CiteScore: 3.70
Self-citation rate: 0.00%
Articles published: 22
Review time: 12 months
Journal description: ACM Transactions on Applied Perception (TAP) aims to strengthen the synergy between computer science and psychology/perception by publishing top quality papers that help to unify research in these fields. The journal publishes inter-disciplinary research of significant and lasting value in any topic area that spans both Computer Science and Perceptual Psychology. All papers must incorporate both perceptual and computer science components.