Authors: Jimin Pi, Bertram E. Shi
DOI: 10.1145/3314111.3319845
Published: 2019-06-25, Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
Citations: 9
Task-embedded online eye-tracker calibration for improving robustness to head motion
Remote eye trackers are widely used for screen-based interactions. They are less intrusive than head-mounted eye trackers, but are generally quite sensitive to head movement. This leads to a need for frequent recalibration, especially in applications requiring accurate eye tracking. We propose an online calibration method that compensates for head movements whenever estimates of the gaze targets are available. For example, in dwell-time-based gaze typing it is reasonable to assume that, for correct selections, the user's gaze target during the dwell time was at the key center. We use this assumption to derive an eye-position-dependent linear transformation matrix for correcting the measured gaze. Our experiments show that the proposed method significantly reduces errors over a large range of head movements.
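The core idea — collecting pairs of measured gaze points and assumed gaze targets (key centers during correct selections), then fitting a linear correction — can be sketched as a least-squares affine fit. This is a minimal illustration, not the authors' exact method: the paper's transformation additionally depends on eye position, while the sketch below assumes a single global correction; all function names are hypothetical.

```python
import numpy as np

def fit_gaze_correction(measured, targets):
    """Fit a 2D affine correction mapping measured gaze points to assumed
    gaze targets (e.g. key centers during correct dwell-time selections).

    measured, targets: (N, 2) arrays of screen coordinates.
    Returns a (3, 2) matrix A such that [x, y, 1] @ A approximates the target.
    """
    n = measured.shape[0]
    # Homogeneous coordinates so the fit includes a translation term.
    X = np.hstack([measured, np.ones((n, 1))])
    A, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return A

def apply_correction(A, gaze):
    """Apply the fitted correction to one or more measured gaze points."""
    g = np.atleast_2d(np.asarray(gaze, dtype=float))
    X = np.hstack([g, np.ones((g.shape[0], 1))])
    return X @ A
```

In an online setting, each correct key selection would contribute one (measured, target) pair, and the correction could be refit over a sliding window so it tracks gradual head movement.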