Real-Time Algorithms for Head Mounted Gaze Tracker

A. Starostenko, Filipp Kozin, R. Gorbachev
DOI: 10.1109/IC-AIAI48757.2019.00025
Published in: 2019 International Conference on Artificial Intelligence: Applications and Innovations (IC-AIAI), September 2019

Abstract

We introduce a set of real-time algorithms for a head-mounted gaze tracker consisting of three cameras: two cameras for the eyes and one camera for the scene. The direction of the optical axis of the eye in three-dimensional space is calculated from the reflections of IR LEDs on the cornea. Individual features of the user are taken into account through a short-term calibration procedure. The described algorithms combine high accuracy in determining the point of gaze with high speed. The procedure for determining the point of gaze consists of the following algorithms: estimation of the pupil position in the eye-camera frames using threshold processing that takes the frame histogram into account, followed by approximation of the pupil by an ellipse; estimation of the positions of the IR LED glints (corneal reflections) in the eye-camera frames using threshold processing; filtering of the glints by brightness, size, and circularity, and rejection of glints outside the iris, whose size is estimated from the distance between the eye camera and the pupil position computed on the previous frame; indexing of the glints by template matching; estimation of the optical-axis angles of the eye using a spherical model of the cornea together with nonlinear optimization methods; and estimation of the point of gaze in the scene-camera frame using the individual user features found during calibration. During calibration, a moving ArUco marker is tracked and detected in the scene-camera frame. To calculate the gaze position in the scene-camera frame, a regression algorithm is used that implicitly accounts for the individual characteristics of the user.
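The final step maps eye features to a point on the scene-camera frame with a regression whose exact form the abstract does not specify. A common choice for this kind of gaze mapping, sketched here purely as an assumption, is a quadratic polynomial fitted by least squares to the calibration samples:

```python
import numpy as np

def fit_gaze_regression(eye_feats, scene_pts):
    """Least-squares fit of a quadratic polynomial mapping from 2-D eye
    features (e.g. optical-axis angles) to scene-camera coordinates.

    eye_feats: (N, 2) calibration features; scene_pts: (N, 2) targets.
    """
    x, y = eye_feats[:, 0], eye_feats[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, scene_pts, rcond=None)
    return coeffs                 # shape (6, 2)

def predict_gaze(coeffs, feat):
    """Evaluate the fitted mapping at one eye-feature pair."""
    x, y = feat
    basis = np.array([1.0, x, y, x * y, x**2, y**2])
    return basis @ coeffs         # (scene_x, scene_y)
```

Because the fit is performed per user on the short-term calibration data (the ArUco-marker sweep), the polynomial coefficients implicitly absorb individual characteristics such as the offset between the optical and visual axes.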