A New Calibration-Free Gaze Tracking Algorithm Based on DE-SLFA

Song Wang, Junning Wang, Hongming Peng, Shupin Gao, Di He
{"title":"一种新的基于DE-SLFA的无标定注视跟踪算法","authors":"Song Wang, Junning Wang, Hongming Peng, Shupin Gao, Di He","doi":"10.1109/ITME.2016.0091","DOIUrl":null,"url":null,"abstract":"Advanced remote gaze estimation systems use automatic calibration procedure without requiring active user involving into the estimation of subject-specific eye parameters. Though automatic calibration process can simplify the difficulty of calibration task, it still needs time to collect information for completing the eye parameters of users before the gaze tracking system is used. This paper proposes a novel method, free of calibration procedure to extract subject-specific eye parameters. To estimate the real-time angles between the optical and visual axes of each eye before calculating the direction of the visual axes of the both the left and right eyes, differential evolution and Shuffled Frog-leaping Algorithm (DE-SLFA) is used to minimize the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display. As a consequence, the inconvenient calibration procedure which may produce possible calibration errors can be eliminated. Computer simulation have been performed to confirm the proposed method.","PeriodicalId":184905,"journal":{"name":"2016 8th International Conference on Information Technology in Medicine and Education (ITME)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A New Calibration-Free Gaze Tracking Algorithm Based on DE-SLFA\",\"authors\":\"Song Wang, Junning Wang, Hongming Peng, Shupin Gao, Di He\",\"doi\":\"10.1109/ITME.2016.0091\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Advanced remote gaze estimation systems use automatic calibration procedure without requiring active user involving into the estimation of subject-specific eye parameters. Though automatic calibration process can simplify the difficulty of calibration task, it still needs time to collect information for completing the eye parameters of users before the gaze tracking system is used. This paper proposes a novel method, free of calibration procedure to extract subject-specific eye parameters. To estimate the real-time angles between the optical and visual axes of each eye before calculating the direction of the visual axes of the both the left and right eyes, differential evolution and Shuffled Frog-leaping Algorithm (DE-SLFA) is used to minimize the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display. As a consequence, the inconvenient calibration procedure which may produce possible calibration errors can be eliminated. 
Computer simulation have been performed to confirm the proposed method.\",\"PeriodicalId\":184905,\"journal\":{\"name\":\"2016 8th International Conference on Information Technology in Medicine and Education (ITME)\",\"volume\":\"5 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 8th International Conference on Information Technology in Medicine and Education (ITME)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITME.2016.0091\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 8th International Conference on Information Technology in Medicine and Education (ITME)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITME.2016.0091","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Advanced remote gaze estimation systems use an automatic calibration procedure that does not require the user to be actively involved in estimating subject-specific eye parameters. Although automatic calibration reduces the difficulty of the calibration task, the system still needs time to collect information on the user's eye parameters before the gaze tracking system can be used. This paper proposes a novel method that extracts subject-specific eye parameters without any calibration procedure. To estimate, in real time, the angles between the optical and visual axes of each eye before calculating the directions of the visual axes of both the left and right eyes, a combined Differential Evolution and Shuffled Frog-Leaping Algorithm (DE-SLFA) is used to minimize the distance between the points where the visual axes of the left and right eyes intersect the display surface while subjects look naturally at the display. As a result, the inconvenient calibration procedure, which can introduce calibration errors, is eliminated. Computer simulations have been performed to confirm the proposed method.
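
The core idea lends itself to a compact optimization sketch. The snippet below illustrates the objective implied by the abstract: treat the per-eye angles between the optical and visual axes as unknowns, project both visual axes onto the display plane, and minimize the distance between the two intersection points over gaze samples collected while the subject looks naturally at the screen. Plain differential evolution from SciPy stands in for the hybrid DE-SLFA of the paper; the geometry helpers (rotate_axis, screen_intersection), the data layout, and all numeric values are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the calibration-free objective, assuming a simplified eye model.
# Unknowns: per-eye kappa angles (horizontal/vertical offsets between optical and
# visual axes). Plain differential evolution replaces the paper's hybrid DE-SLFA.
import numpy as np
from scipy.optimize import differential_evolution

def rotate_axis(optical_axis, alpha, beta):
    """Tilt a unit optical-axis vector by horizontal (alpha) and vertical (beta)
    kappa angles to approximate the visual axis (simplified model, assumption)."""
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    rot_y = np.array([[ca, 0, sa], [0, 1, 0], [-sa, 0, ca]])   # horizontal tilt
    rot_x = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])   # vertical tilt
    return rot_x @ rot_y @ optical_axis

def screen_intersection(eye_center, visual_axis):
    """Intersect the ray eye_center + t * visual_axis with the display plane z = 0."""
    t = -eye_center[2] / visual_axis[2]
    return eye_center + t * visual_axis

def objective(kappa, samples):
    """Sum of squared left/right gaze-point distances over all samples.
    kappa = (alpha_L, beta_L, alpha_R, beta_R) in radians."""
    a_l, b_l, a_r, b_r = kappa
    total = 0.0
    for eye_l, axis_l, eye_r, axis_r in samples:
        p_l = screen_intersection(eye_l, rotate_axis(axis_l, a_l, b_l))
        p_r = screen_intersection(eye_r, rotate_axis(axis_r, a_r, b_r))
        total += np.sum((p_l - p_r) ** 2)
    return total

# Each sample holds (left eye center, left optical axis, right eye center,
# right optical axis) as estimated by a remote gaze tracker; synthetic
# placeholder values are used here.
axis_l = np.array([0.02, -0.05, -1.0]); axis_l /= np.linalg.norm(axis_l)
axis_r = np.array([-0.02, -0.05, -1.0]); axis_r /= np.linalg.norm(axis_r)
samples = [
    (np.array([-0.03, 0.0, 0.6]), axis_l, np.array([0.03, 0.0, 0.6]), axis_r),
]

bounds = [(-0.2, 0.2)] * 4  # kappa angles are typically only a few degrees
result = differential_evolution(objective, bounds, args=(samples,), seed=0)
print("estimated kappa angles (rad):", result.x)
```

In a real system many frames would be accumulated, and the SLFA component would be combined with the DE search over the same objective; this sketch only shows the shape of the minimization problem.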