Free-Head Appearance-Based Eye Gaze Estimation on Mobile Devices

Jigang Liu, Bu-Sung Lee, D. Rajan
{"title":"Free-Head Appearance-Based Eye Gaze Estimation on Mobile Devices","authors":"Jigang Liu, Bu-Sung Lee, D. Rajan","doi":"10.1109/ICAIIC.2019.8669057","DOIUrl":null,"url":null,"abstract":"Eye gaze tracking plays an important role in human-computer interaction applications. In recent years, many research have been performed to explore gaze estimation methods to handle free-head movement, most of which focused on gaze direction estimation. Gaze point estimation on the screen is another important application. In this paper, we proposed a two-step training network, called GazeEstimator, to improve the estimation accuracy of gaze location on mobile devices. The first step is to train an eye landmarks localization network on 300W-LP dataset [1], and the second step is to train a gaze estimation network on GazeCapture dataset [2]. Some processing operations are performed between the two networks for data cleaning. The first network is able to localize eye precisely on the image, while the gaze estimation network use only eye images and eye grids as inputs, and it is robust to facial expressions and occlusion.Compared with state-of-the-art gaze estimation method, iTracker, our proposed deep network achieves higher accuracy and is able to estimate gaze location even in the condition that the full face cannot be detected.","PeriodicalId":273383,"journal":{"name":"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC.2019.8669057","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16

Abstract

Eye gaze tracking plays an important role in human-computer interaction applications. In recent years, much research has been performed to explore gaze estimation methods that handle free head movement, most of it focused on gaze direction estimation. Gaze point estimation on the screen is another important application. In this paper, we propose a two-step training network, called GazeEstimator, to improve the estimation accuracy of gaze location on mobile devices. The first step trains an eye landmark localization network on the 300W-LP dataset [1], and the second step trains a gaze estimation network on the GazeCapture dataset [2]. Some processing operations are performed between the two networks for data cleaning. The first network localizes the eyes precisely in the image, while the gaze estimation network uses only eye images and eye grids as inputs and is robust to facial expressions and occlusion. Compared with the state-of-the-art gaze estimation method iTracker, our proposed deep network achieves higher accuracy and is able to estimate the gaze location even when the full face cannot be detected.
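To make the described input scheme concrete, the sketch below shows a minimal PyTorch model that, like the gaze estimation stage described in the abstract, regresses an (x, y) gaze point from left/right eye crops plus eye grids (binary masks encoding where each eye crop lies in the full frame). This is an illustrative assumption-laden sketch, not the paper's actual GazeEstimator architecture: the layer sizes, crop resolution (64x64), grid size (25x25), and the class name EyeGazeNet are all invented here for clarity.

```python
# Hypothetical sketch of an eye-crop + eye-grid gaze regressor.
# NOT the paper's architecture; shapes and layer widths are illustrative.
import torch
import torch.nn as nn

class EyeGazeNet(nn.Module):
    def __init__(self, grid_size=25):
        super().__init__()
        # Shared CNN applied to the left- and right-eye crops (e.g. 3x64x64).
        self.eye_cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
        )
        # The two eye grids encode each crop's location within the full image.
        self.grid_fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2 * grid_size * grid_size, 128), nn.ReLU(),
        )
        # Fuse both eye features with the grid feature and regress the gaze point.
        self.head = nn.Sequential(
            nn.Linear(128 * 3, 128), nn.ReLU(),
            nn.Linear(128, 2),  # (x, y) gaze location in screen coordinates
        )

    def forward(self, left_eye, right_eye, eye_grids):
        f_left = self.eye_cnn(left_eye)
        f_right = self.eye_cnn(right_eye)
        f_grid = self.grid_fc(eye_grids)
        return self.head(torch.cat([f_left, f_right, f_grid], dim=1))

# Smoke test with random tensors shaped like a single training sample.
net = EyeGazeNet()
xy = net(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64),
         torch.randn(1, 2, 25, 25))
print(xy.shape)  # torch.Size([1, 2])
```

Note that a face image is deliberately absent from the inputs; restricting the gaze stage to eye crops and grids is what lets such a model keep working when the full face cannot be detected.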