Real-Time Algorithms for Head Mounted Gaze Tracker
A. Starostenko, Filipp Kozin, R. Gorbachev
2019 International Conference on Artificial Intelligence: Applications and Innovations (IC-AIAI), September 2019
DOI: 10.1109/IC-AIAI48757.2019.00025
We introduce a set of real-time algorithms for a head-mounted gaze tracker consisting of three cameras: two eye cameras and one scene camera. The direction of the optical axis of the eye in three-dimensional space is computed from the corneal reflections of IR LEDs. Individual features of the user are taken into account through a short-term calibration procedure. The described algorithms combine high accuracy in determining the point of gaze with high speed. The procedure for determining the point of gaze consists of the following steps:
1. Estimation of the pupil position on the eye-camera frames using threshold processing based on the frame histogram, followed by approximation of the pupil by an ellipse.
2. Estimation of the IR LED glint positions on the eye-camera frames using threshold processing.
3. Filtering of the glints by brightness, size, and circularity, and rejection of glints lying outside the iris; the iris size is estimated from the distance between the eye camera and the pupil position computed on the previous frame.
4. Indexing of the glints by template matching.
5. Estimation of the optical-axis angles of the eye using a spherical model of the cornea and nonlinear optimization.
6. Estimation of the point of gaze on the scene-camera frame using the individual user features found during calibration.
During calibration, the movement of the ArUco calibration mark and its detection on the scene-camera frames are used. To compute the gaze position on the scene camera, a regression algorithm is used that implicitly takes the individual characteristics of the user into account.
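The first pipeline step (histogram-based thresholding of the eye-camera frame, then approximating the pupil by an ellipse) could be sketched as below. This is a minimal illustration, not the authors' implementation: the 5th-percentile threshold and the moment-based ellipse fit are assumptions standing in for whatever thresholding rule and fitting method the paper uses.

```python
import numpy as np

def detect_pupil(gray):
    """Approximate the pupil on an eye-camera frame by an ellipse.

    The threshold comes from the frame's intensity histogram (the pupil
    is the darkest large region); the dark pixels are then summarized by
    an ellipse via image moments.  The percentile cut-off is illustrative.
    Returns ((cx, cy), (axis_a, axis_b), angle_deg) or None.
    """
    # Pick a threshold from the histogram's low-intensity tail.
    thresh = np.percentile(gray, 5)
    ys, xs = np.nonzero(gray <= thresh)
    if xs.size < 5:
        return None
    cx, cy = xs.mean(), ys.mean()
    # Second central moments give the orientation and semi-axes of the
    # best-fitting ellipse for the dark pixel mass.
    cov = np.cov(np.stack([xs - cx, ys - cy]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = 2.0 * np.sqrt(np.maximum(eigvals, 0.0))  # ~disk radius per axis
    angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
    return (cx, cy), (axes[0], axes[1]), angle
```

In practice this would run on each eye-camera frame before glint detection; a library contour-based ellipse fit would serve equally well.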
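The glint-filtering criteria listed in step 3 (brightness, size, circularity, and position inside the iris) amount to a predicate over candidate blobs. A rough sketch follows; the blob representation, all threshold values, and the parameter names are illustrative assumptions, not values from the paper.

```python
import math

def keep_glint(blob, iris_center, iris_radius,
               min_brightness=220, min_area=4, max_area=400,
               min_circularity=0.6):
    """Return True if a candidate glint passes the filters described in
    the text: brightness, size, circularity, and lying inside the iris.

    `blob` is a dict with keys cx, cy (centroid), area, perimeter and
    mean_brightness.  All thresholds here are illustrative.
    """
    if not (min_area <= blob["area"] <= max_area):
        return False
    if blob["mean_brightness"] < min_brightness:
        return False
    # Circularity 4*pi*A / P^2 equals 1 for a perfect circle and drops
    # toward 0 for elongated shapes.
    circ = 4.0 * math.pi * blob["area"] / max(blob["perimeter"] ** 2, 1e-9)
    if circ < min_circularity:
        return False
    # Reject glints outside the iris; per the text, the iris radius is
    # estimated from the eye-camera-to-pupil distance of the previous frame.
    dx = blob["cx"] - iris_center[0]
    dy = blob["cy"] - iris_center[1]
    return math.hypot(dx, dy) <= iris_radius
```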
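Step 4 assigns each surviving glint to the IR LED that produced it by matching the detected constellation against a template of expected LED positions. The greedy centered nearest-neighbour matching below is only one way to realize such an assignment, assumed here for illustration; the paper's template-matching procedure may differ.

```python
import numpy as np

def index_glints(glints, template):
    """Assign each detected glint to an LED of the template.

    Both point sets are centered on their centroids, then each glint is
    greedily matched to the nearest still-unused template LED.  Returns
    a list where entry i is the template index assigned to glints[i].
    """
    g = np.asarray(glints, float) - np.mean(glints, axis=0)
    t = np.asarray(template, float) - np.mean(template, axis=0)
    labels = []
    free = list(range(len(t)))          # template LEDs not yet assigned
    for p in g:
        dists = [np.hypot(*(p - t[j])) for j in free]
        labels.append(free.pop(int(np.argmin(dists))))
    return labels
```

Centering makes the match invariant to the constellation's translation on the frame; rotation and scale changes between frames are small for a head-mounted rig, which is what makes this simple scheme plausible.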
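The final regression step maps per-frame eye features to a 2-D gaze point on the scene-camera frame, fitted from samples collected while the user follows the moving ArUco mark. A minimal sketch using quadratic polynomial features and linear least squares is given below; the feature set and model form are assumptions, since the abstract does not specify the regression used.

```python
import numpy as np

def design(features):
    """Quadratic polynomial expansion of per-frame eye features (e.g. the
    optical-axis angles of both eyes).  The exact feature set is an
    illustrative assumption."""
    f = np.asarray(features, dtype=float)            # shape (N, k)
    cols = [np.ones(len(f))]                         # intercept
    for i in range(f.shape[1]):
        cols.append(f[:, i])                         # linear terms
        for j in range(i, f.shape[1]):
            cols.append(f[:, i] * f[:, j])           # quadratic terms
    return np.stack(cols, axis=1)

def fit_gaze_regression(features, targets):
    """Least-squares fit from calibration samples: `targets` are the
    ArUco mark positions on the scene-camera frame, `features` the eye
    measurements recorded at the same moments."""
    W, *_ = np.linalg.lstsq(design(features), np.asarray(targets), rcond=None)
    return W

def predict_gaze(W, features):
    """Map eye features to a gaze point on the scene-camera frame."""
    return design(features) @ W
```

Because the regression is fitted per user during the short-term calibration, it implicitly absorbs individual characteristics (eye geometry, camera placement) without modeling them explicitly.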