{"title":"在无头远程眼注视检测系统中使用移动视觉目标进行眼注视校准","authors":"Yukiyoshi Kondou, Y. Ebisawa","doi":"10.1109/VECIMS.2008.4592770","DOIUrl":null,"url":null,"abstract":"The development of precise head-free remote eye-gaze detection devices with easy calibration is desired for human interface and human monitoring. In the system that we developed, the 3D position of a pupil is measured using stereo wide view cameras. Next, a narrow view camera unit detecting the center of the pupil and the corneal reflection of the light source is aimed at the subject eye pupil using a pan-tilt drive. Previously, we proposed an eye-gaze detection theory in which a subject must look into the center of the aperture of a narrow view camera and at a small marker on a PC screen (two-point calibration). In our past experiments conducted with such calibration at the position of approx. 70 cm from a PC screen, the subjects shifted head positions approx. 15 cm in depth and 12 cm to lateral. The results of our experiments have already confirmed the usefulness of our two-point calibration theory and the developed system. However, unlike looking at the marker on the PC screen, it was difficult during our experiments for the subjects to look at the center of the camera at the center of the aperture because the position was unclear and it was difficult to provide the timing for the subject to look at it. In this paper, we propose a calibration method by looking at just two known points other than the camera. Furthermore, we propose a calibration method using a moving visual target on the PC screen. The experimental results show that the eye gaze precision is improved by using the moving target, rather than by using the two points on the screen or by using the previous calibration method; and, the time necessary for calibration is not different.","PeriodicalId":284224,"journal":{"name":"2008 IEEE Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":"{\"title\":\"Easy eye-gaze calibration using a moving visual target in the head-free remote eye-gaze detection system\",\"authors\":\"Yukiyoshi Kondou, Y. Ebisawa\",\"doi\":\"10.1109/VECIMS.2008.4592770\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The development of precise head-free remote eye-gaze detection devices with easy calibration is desired for human interface and human monitoring. In the system that we developed, the 3D position of a pupil is measured using stereo wide view cameras. Next, a narrow view camera unit detecting the center of the pupil and the corneal reflection of the light source is aimed at the subject eye pupil using a pan-tilt drive. Previously, we proposed an eye-gaze detection theory in which a subject must look into the center of the aperture of a narrow view camera and at a small marker on a PC screen (two-point calibration). In our past experiments conducted with such calibration at the position of approx. 70 cm from a PC screen, the subjects shifted head positions approx. 15 cm in depth and 12 cm to lateral. The results of our experiments have already confirmed the usefulness of our two-point calibration theory and the developed system. 
However, unlike looking at the marker on the PC screen, it was difficult during our experiments for the subjects to look at the center of the camera at the center of the aperture because the position was unclear and it was difficult to provide the timing for the subject to look at it. In this paper, we propose a calibration method by looking at just two known points other than the camera. Furthermore, we propose a calibration method using a moving visual target on the PC screen. The experimental results show that the eye gaze precision is improved by using the moving target, rather than by using the two points on the screen or by using the previous calibration method; and, the time necessary for calibration is not different.\",\"PeriodicalId\":284224,\"journal\":{\"name\":\"2008 IEEE Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-07-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"21\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2008 IEEE Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/VECIMS.2008.4592770\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 IEEE Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VECIMS.2008.4592770","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Easy eye-gaze calibration using a moving visual target in the head-free remote eye-gaze detection system
The development of precise head-free remote eye-gaze detection devices with easy calibration is desired for human interfaces and human monitoring. In the system that we developed, the 3D position of a pupil is measured using stereo wide-view cameras. Next, a narrow-view camera unit, which detects the center of the pupil and the corneal reflection of a light source, is aimed at the subject's pupil using a pan-tilt drive. Previously, we proposed an eye-gaze detection theory in which a subject must look into the center of the aperture of the narrow-view camera and at a small marker on a PC screen (two-point calibration). In our past experiments, conducted with this calibration at a position approximately 70 cm from the PC screen, the subjects shifted their head positions approximately 15 cm in depth and 12 cm laterally. The results of those experiments have already confirmed the usefulness of our two-point calibration theory and of the developed system. However, unlike looking at the marker on the PC screen, it was difficult for the subjects to fixate on the center of the camera aperture, because its position was unclear and it was hard to cue the subjects on when to look at it. In this paper, we propose a calibration method in which the subject looks at just two known points other than the camera. Furthermore, we propose a calibration method that uses a moving visual target on the PC screen. The experimental results show that eye-gaze precision is improved by using the moving target compared with using the two points on the screen or the previous calibration method, while the time required for calibration is about the same.
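The abstract does not give the calibration equations themselves, so the following Python sketch is only an illustrative assumption of how calibration data gathered from a moving on-screen target could be fitted: it assumes the common pupil-center/corneal-reflection model in which gaze angle is roughly a linear function (gain plus offset) of the vector from the corneal reflection to the pupil center, with the coefficients estimated by least squares over samples recorded while the target sweeps the screen. The function names, the linear model, and the synthetic data are hypothetical and are not taken from the paper.

```python
# Illustrative sketch, not the authors' method: assumes gaze angle is roughly
# linear in the pupil-center minus corneal-reflection vector r, i.e.
# angle ~= gain * r + offset, fitted from samples collected while a visual
# target moves across the PC screen.
import numpy as np


def fit_gaze_calibration(r_samples, target_angles):
    """Fit per-axis gain and offset mapping the pupil-corneal-reflection
    vector (image-plane units) to gaze angles (degrees).

    r_samples     : (N, 2) array of pupil-center minus corneal-reflection vectors
    target_angles : (N, 2) array of known horizontal/vertical target angles
    Returns (gains, offsets), each of shape (2,).
    """
    gains, offsets = np.empty(2), np.empty(2)
    for axis in range(2):
        # Linear least squares per axis: angle ~= gain * r + offset.
        A = np.column_stack([r_samples[:, axis], np.ones(len(r_samples))])
        solution, *_ = np.linalg.lstsq(A, target_angles[:, axis], rcond=None)
        gains[axis], offsets[axis] = solution
    return gains, offsets


def estimate_gaze(r, gains, offsets):
    """Map a new pupil-corneal-reflection vector to an estimated gaze angle."""
    return gains * np.asarray(r) + offsets


if __name__ == "__main__":
    # Hypothetical data: a target sweeping the screen while r is recorded.
    rng = np.random.default_rng(0)
    true_gain, true_offset = np.array([0.8, 0.9]), np.array([0.5, -0.3])
    r = rng.uniform(-20, 20, size=(200, 2))  # simulated image-plane vectors
    angles = true_gain * r + true_offset + rng.normal(0, 0.2, r.shape)
    g, o = fit_gaze_calibration(r, angles)
    print("fitted gain:", g, "fitted offset:", o)
    print("gaze for r=(5, -3):", estimate_gaze((5.0, -3.0), g, o))
```

A moving target naturally yields many such (r, angle) pairs spread over the screen, which is one plausible reason a sweep can give a better-conditioned fit than two fixed calibration points while taking no additional time.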