{"title":"以虹膜为基本特征的眼姿估计与跟踪","authors":"Dmitry Shmunk","doi":"10.1109/OJID.2025.3616949","DOIUrl":null,"url":null,"abstract":"A novel, fast, and robust method for 3D eye pose tracking that leverages the anatomical constancy of the human iris to improve accuracy and computational efficiency is proposed. Traditional pupil-based methods suffer from limitations due to pupil size variability, decentering, and the need for complex corrections for refraction through the corneal bulge. In contrast, the iris, due to its fixed size and direct visibility, serves as a more reliable feature for precise eye pose estimation. Our method combines key advantages of both model-based and regression-based approaches without requiring external glint-producing light sources or high computational overheads associated with neural-network-based solutions. The iris is used as the primary tracking feature, enabling robust detection even under partial occlusion and in users wearing prescription eyewear. Exploiting the consistent geometry of the iris, we estimate gaze direction and 3D eye position with high precision. Unlike existing methods, the proposed approach minimizes reliance on pupil measurements, employing the pupil’s high contrast only to augment iris detection. This strategy ensures robustness in real-world scenarios, including varying illumination and stray light/glints/distortions introduced by corrective eyewear. Experimental results show that the method achieves low computational cost while maintaining state-of-the-art performance.","PeriodicalId":100634,"journal":{"name":"IEEE Open Journal on Immersive Displays","volume":"2 ","pages":"96-105"},"PeriodicalIF":0.0000,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11189046","citationCount":"0","resultStr":"{\"title\":\"Eye Pose Estimation and Tracking Using Iris as a Base Feature\",\"authors\":\"Dmitry Shmunk\",\"doi\":\"10.1109/OJID.2025.3616949\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A novel, fast, and robust method for 3D eye pose tracking that leverages the anatomical constancy of the human iris to improve accuracy and computational efficiency is proposed. Traditional pupil-based methods suffer from limitations due to pupil size variability, decentering, and the need for complex corrections for refraction through the corneal bulge. In contrast, the iris, due to its fixed size and direct visibility, serves as a more reliable feature for precise eye pose estimation. Our method combines key advantages of both model-based and regression-based approaches without requiring external glint-producing light sources or high computational overheads associated with neural-network-based solutions. The iris is used as the primary tracking feature, enabling robust detection even under partial occlusion and in users wearing prescription eyewear. Exploiting the consistent geometry of the iris, we estimate gaze direction and 3D eye position with high precision. Unlike existing methods, the proposed approach minimizes reliance on pupil measurements, employing the pupil’s high contrast only to augment iris detection. This strategy ensures robustness in real-world scenarios, including varying illumination and stray light/glints/distortions introduced by corrective eyewear. 
Experimental results show that the method achieves low computational cost while maintaining state-of-the-art performance.\",\"PeriodicalId\":100634,\"journal\":{\"name\":\"IEEE Open Journal on Immersive Displays\",\"volume\":\"2 \",\"pages\":\"96-105\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11189046\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Open Journal on Immersive Displays\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11189046/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal on Immersive Displays","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11189046/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Eye Pose Estimation and Tracking Using Iris as a Base Feature
We propose a novel, fast, and robust method for 3D eye pose tracking that leverages the anatomical constancy of the human iris to improve accuracy and computational efficiency. Traditional pupil-based methods suffer from limitations due to pupil size variability, decentering, and the need for complex corrections for refraction through the corneal bulge. In contrast, the iris, due to its fixed size and direct visibility, serves as a more reliable feature for precise eye pose estimation. Our method combines key advantages of both model-based and regression-based approaches without requiring external glint-producing light sources or the high computational overheads associated with neural-network-based solutions. The iris is used as the primary tracking feature, enabling robust detection even under partial occlusion and for users wearing prescription eyewear. Exploiting the consistent geometry of the iris, we estimate gaze direction and 3D eye position with high precision. Unlike existing methods, the proposed approach minimizes reliance on pupil measurements, employing the pupil's high contrast only to augment iris detection. This strategy ensures robustness in real-world scenarios, including varying illumination and stray light, glints, and distortions introduced by corrective eyewear. Experimental results show that the method achieves low computational cost while maintaining state-of-the-art performance.
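The abstract does not detail the estimation pipeline, but the core geometric idea behind iris-based pose recovery can be illustrated with a minimal sketch: under a weak-perspective camera model, a circular iris of roughly constant physical diameter projects to an image ellipse whose major axis encodes distance and whose axis ratio encodes the tilt of the iris plane (a coarse gaze angle). The function name, the assumed average iris diameter (≈11.7 mm), and the calibrated focal length are illustrative assumptions, not the paper's actual algorithm; the ellipse fit to the limbus is assumed to be available from an upstream detector.

```python
import math

# Illustrative sketch only, not the paper's method: weak-perspective projection
# of a circular iris of known physical diameter onto the image plane.

IRIS_DIAMETER_MM = 11.7  # commonly cited average human iris diameter (assumption)


def eye_pose_from_iris_ellipse(major_px: float, minor_px: float, focal_px: float):
    """Estimate gaze tilt (degrees) and eye distance (mm) from a fitted iris ellipse.

    major_px, minor_px: ellipse axes in pixels (e.g., from an ellipse fit to the limbus)
    focal_px: camera focal length in pixels (assumed known from calibration)
    """
    # A circle viewed at tilt angle theta foreshortens its minor axis by cos(theta),
    # while the major axis is (approximately) unaffected under weak perspective.
    ratio = min(minor_px / major_px, 1.0)
    tilt_deg = math.degrees(math.acos(ratio))

    # Pinhole projection of the unforeshortened major axis gives the distance.
    distance_mm = focal_px * IRIS_DIAMETER_MM / major_px
    return tilt_deg, distance_mm


if __name__ == "__main__":
    # Example: a 60 x 48 px iris ellipse seen by a camera with an 800 px focal length.
    tilt, dist = eye_pose_from_iris_ellipse(major_px=60.0, minor_px=48.0, focal_px=800.0)
    print(f"gaze tilt ~ {tilt:.1f} deg, eye distance ~ {dist:.0f} mm")
```

This toy model recovers only the tilt magnitude and distance; a full 3D eye pose additionally requires the tilt direction and eye-center localization, which the paper addresses with its combined model- and regression-based approach.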