VR Eye-Tracking using Deflectometry

Jiazhang Wang, Bingjie Xu, Tianfu Wang, W. J. Lee, M. Walton, N. Matsuda, O. Cossairt, F. Willomitzer
{"title":"使用偏转法的VR眼动追踪","authors":"Jiazhang Wang, Bingjie Xu, Tianfu Wang, W. J. Lee, M. Walton, N. Matsuda, O. Cossairt, F. Willomitzer","doi":"10.1364/cosi.2021.cf2e.3","DOIUrl":null,"url":null,"abstract":"We present a novel approach for accurate eye tracking as required, e.g., in VR/AR/MR headsets. Our method exploits the retrieved surface normals and dense 3D features extracted from deflectometry measurements to estimate the gazing direction. © 2021 The Author(s) 1.​ ​Introduction Although the task has been studied for several decades, a robust solution to accurate and fast eye tracking remains an unsolved problem. With the advent of Virtual, Augmented, or Mixed Reality (VR/AR/MR), accurate eye tracking recently attracted considerable research interest mainly because it enables many functions that significantly improve the performance and experience of VR/AR/MR headsets, such as foveated rendering, or compensating for the accommodation-convergence reflex. To estimate the gazing direction of the human eye, current approaches either utilize 2D features detected from 2D eye images, or exploit sparse reflections of a few point light sources at the eye surface (“corneal/scleral reflections”). The latter retrieves 3D surface information for an improved gazing direction calculation, albeit only at maximal ~10 surface points. In this contribution we introduce an approach that significantly increases the information content provided from corneal or scleral reflections by using Deflectometry to acquire a dense and precise 3D model of the eye surface. The acquisition of ~1 million surface points per measurement step is easily achievable with off-the-shelf hardware. We exploit the retrieved surface normals and dense 3D features estimated via deflectometry to accurately estimate the gazing direction. 2. Method and Results Deflectometry is an established method in surface metrology to reconstruct the 3D surface of specular objects, such as freeform lenses, car windshields, or technical parts [1-3]: The reflection of a screen displaying a known pattern (e.g. a sinusoid) is observed after reflection from the specular surface under test. From the deformation of the pattern in the camera image, the normal vectors of the surface (and eventually the surface shape via integration) can be calculated. The inherent depth-normal-ambiguity is solved by adding a second camera, which results in a so-called “Stereo-Deflectometry” system [1]. Our proposed method utilizes Deflectometry for a dense and precise measurement of the eye surface. To calculate the gazing direction we first trace back the measured surface normal vectors towards the center of the eye. Due to the vastly different radii of cornea and sclera, the back-traced surface normals aggregate at two points inside the virtual 3D eye model: the center of the corneal sphere and the center of the scleral sphere (see Fig.1.c). Eventually, we calculate the Fig. 1. Calculating the gazing direction using dense 3D surface measurements. a) Deflectometry measurement: Camera image of the sinusoidal screen pattern reflected from the eye surface. b) Error map: calculated normal map w.r.t. the ground truth (error in degrees). c) Calculation of the gazing direction by tracing back the measured surface normals to the scleral and corneal center. 
CF2E.3.pdf OSA Imaging and Applied Optics Congress 2021 © OSA 2021","PeriodicalId":19628,"journal":{"name":"OSA Imaging and Applied Optics Congress 2021 (3D, COSI, DH, ISA, pcAOP)","volume":"26 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"VR Eye-Tracking using Deflectometry\",\"authors\":\"Jiazhang Wang, Bingjie Xu, Tianfu Wang, W. J. Lee, M. Walton, N. Matsuda, O. Cossairt, F. Willomitzer\",\"doi\":\"10.1364/cosi.2021.cf2e.3\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a novel approach for accurate eye tracking as required, e.g., in VR/AR/MR headsets. Our method exploits the retrieved surface normals and dense 3D features extracted from deflectometry measurements to estimate the gazing direction. © 2021 The Author(s) 1.​ ​Introduction Although the task has been studied for several decades, a robust solution to accurate and fast eye tracking remains an unsolved problem. With the advent of Virtual, Augmented, or Mixed Reality (VR/AR/MR), accurate eye tracking recently attracted considerable research interest mainly because it enables many functions that significantly improve the performance and experience of VR/AR/MR headsets, such as foveated rendering, or compensating for the accommodation-convergence reflex. To estimate the gazing direction of the human eye, current approaches either utilize 2D features detected from 2D eye images, or exploit sparse reflections of a few point light sources at the eye surface (“corneal/scleral reflections”). The latter retrieves 3D surface information for an improved gazing direction calculation, albeit only at maximal ~10 surface points. In this contribution we introduce an approach that significantly increases the information content provided from corneal or scleral reflections by using Deflectometry to acquire a dense and precise 3D model of the eye surface. The acquisition of ~1 million surface points per measurement step is easily achievable with off-the-shelf hardware. We exploit the retrieved surface normals and dense 3D features estimated via deflectometry to accurately estimate the gazing direction. 2. Method and Results Deflectometry is an established method in surface metrology to reconstruct the 3D surface of specular objects, such as freeform lenses, car windshields, or technical parts [1-3]: The reflection of a screen displaying a known pattern (e.g. a sinusoid) is observed after reflection from the specular surface under test. From the deformation of the pattern in the camera image, the normal vectors of the surface (and eventually the surface shape via integration) can be calculated. The inherent depth-normal-ambiguity is solved by adding a second camera, which results in a so-called “Stereo-Deflectometry” system [1]. Our proposed method utilizes Deflectometry for a dense and precise measurement of the eye surface. To calculate the gazing direction we first trace back the measured surface normal vectors towards the center of the eye. Due to the vastly different radii of cornea and sclera, the back-traced surface normals aggregate at two points inside the virtual 3D eye model: the center of the corneal sphere and the center of the scleral sphere (see Fig.1.c). Eventually, we calculate the Fig. 1. Calculating the gazing direction using dense 3D surface measurements. 
a) Deflectometry measurement: Camera image of the sinusoidal screen pattern reflected from the eye surface. b) Error map: calculated normal map w.r.t. the ground truth (error in degrees). c) Calculation of the gazing direction by tracing back the measured surface normals to the scleral and corneal center. CF2E.3.pdf OSA Imaging and Applied Optics Congress 2021 © OSA 2021\",\"PeriodicalId\":19628,\"journal\":{\"name\":\"OSA Imaging and Applied Optics Congress 2021 (3D, COSI, DH, ISA, pcAOP)\",\"volume\":\"26 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"OSA Imaging and Applied Optics Congress 2021 (3D, COSI, DH, ISA, pcAOP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1364/cosi.2021.cf2e.3\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"OSA Imaging and Applied Optics Congress 2021 (3D, COSI, DH, ISA, pcAOP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1364/cosi.2021.cf2e.3","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

We present a novel approach for accurate eye tracking as required, e.g., in VR/AR/MR headsets. Our method exploits the retrieved surface normals and dense 3D features extracted from deflectometry measurements to estimate the gazing direction. © 2021 The Author(s)

1. Introduction

Although the task has been studied for several decades, a robust solution to accurate and fast eye tracking remains an unsolved problem. With the advent of Virtual, Augmented, or Mixed Reality (VR/AR/MR), accurate eye tracking has recently attracted considerable research interest, mainly because it enables many functions that significantly improve the performance and experience of VR/AR/MR headsets, such as foveated rendering or compensating for the accommodation-convergence reflex. To estimate the gazing direction of the human eye, current approaches either utilize 2D features detected in 2D eye images, or exploit sparse reflections of a few point light sources at the eye surface ("corneal/scleral reflections"). The latter retrieves 3D surface information for an improved gazing-direction calculation, albeit only at maximally ~10 surface points. In this contribution we introduce an approach that significantly increases the information content provided by corneal or scleral reflections, using deflectometry to acquire a dense and precise 3D model of the eye surface. The acquisition of ~1 million surface points per measurement step is easily achievable with off-the-shelf hardware. We exploit the retrieved surface normals and the dense 3D features estimated via deflectometry to accurately estimate the gazing direction.

2. Method and Results

Deflectometry is an established method in surface metrology to reconstruct the 3D surface of specular objects such as freeform lenses, car windshields, or technical parts [1-3]: a screen displays a known pattern (e.g., a sinusoid), and its reflection off the specular surface under test is observed by a camera. From the deformation of the pattern in the camera image, the normal vectors of the surface (and eventually the surface shape, via integration) can be calculated. The inherent depth-normal ambiguity is resolved by adding a second camera, which results in a so-called "stereo-deflectometry" system [1]. Our proposed method utilizes deflectometry for a dense and precise measurement of the eye surface. To calculate the gazing direction, we first trace the measured surface normal vectors back towards the center of the eye. Due to the vastly different radii of cornea and sclera, the back-traced surface normals aggregate at two points inside the virtual 3D eye model: the center of the corneal sphere and the center of the scleral sphere (see Fig. 1c). Eventually, we calculate the gazing direction from these two center points.

Fig. 1. Calculating the gazing direction using dense 3D surface measurements. a) Deflectometry measurement: camera image of the sinusoidal screen pattern reflected from the eye surface. b) Error map: calculated normal map w.r.t. the ground truth (error in degrees). c) Calculation of the gazing direction by tracing back the measured surface normals to the scleral and corneal centers.
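The deflectometry principle described in Section 2 can be sketched compactly. The following is a minimal, illustrative Python/NumPy sketch, not the authors' implementation: a four-step phase-shifting decode recovers the phase of the sinusoidal screen pattern at each camera pixel (and thereby the matched screen location), and the surface normal at a given 3D surface point then follows from the law of reflection as the bisector of the directions towards the camera and towards the matched screen point. The function names, the four-step phase shift, and the assumption that the 3D surface point is already known (e.g., resolved by the stereo arrangement) are our own assumptions.

```python
import numpy as np

def decode_screen_phase(images):
    """Four-step phase shifting: recover the phase of the sinusoidal screen
    pattern at every camera pixel from four captures shifted by 90 degrees.
    `images` is an array (or list of arrays) of shape (4, H, W)."""
    I0, I1, I2, I3 = images
    # Standard four-bucket formula; the phase encodes the screen coordinate.
    return np.arctan2(I3 - I1, I0 - I2)

def surface_normal(p_surface, c_camera, s_screen):
    """Law of reflection: the surface normal bisects the directions from the
    surface point towards the camera center and towards the matched screen
    point. All inputs are 3D points (shape (3,)) in a common coordinate frame;
    knowing p_surface is an assumption (e.g., from the stereo setup)."""
    to_camera = c_camera - p_surface
    to_screen = s_screen - p_surface
    n = to_camera / np.linalg.norm(to_camera) + to_screen / np.linalg.norm(to_screen)
    return n / np.linalg.norm(n)
```

Mapping the decoded phase to a metric screen coordinate requires the calibrated screen pose and pattern period, which are omitted here for brevity.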
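The back-tracing step illustrated in Fig. 1c can be read as finding the point closest, in a least-squares sense, to the bundle of lines spanned by the measured surface points and their normals, once for the corneal region and once for the scleral region. The sketch below assumes the corneal and scleral measurement points have already been segmented, and it takes the gaze axis along the line connecting the two recovered centers; that final step is our simplifying assumption, since the abstract does not spell out the closing computation.

```python
import numpy as np

def nearest_point_to_lines(points, directions):
    """Least-squares 3D point closest to a bundle of lines, each defined by a
    point points[i] and a direction directions[i] (both arrays of shape (N, 3)).
    Solves  sum_i (I - d_i d_i^T) q = sum_i (I - d_i d_i^T) p_i."""
    d = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p_i, d_i in zip(points, d):
        P = np.eye(3) - np.outer(d_i, d_i)   # projector orthogonal to the line
        A += P
        b += P @ p_i
    return np.linalg.solve(A, b)

def gaze_direction(corneal_pts, corneal_normals, scleral_pts, scleral_normals):
    """Back-trace the measured normals of the corneal and scleral regions to
    their respective aggregation points (sphere centers), then take the line
    through both centers as the gaze axis. The segmentation into corneal and
    scleral points and the final axis definition are assumptions of this sketch."""
    c_cornea = nearest_point_to_lines(corneal_pts, corneal_normals)
    c_sclera = nearest_point_to_lines(scleral_pts, scleral_normals)
    g = c_cornea - c_sclera
    return g / np.linalg.norm(g)
```

Because every measured surface point contributes one line to each least-squares fit, the dense deflectometry data (~1 million points per step) strongly overdetermines the two centers, which is what motivates the approach over sparse glint-based methods.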