Eye-Gaze to Screen Location Mapping for UI Evaluation of Webpages
M. S. Hossain, A. Ali, M. Amin
Proceedings of the 3rd International Conference on Graphics and Signal Processing, 2019-06-01
DOI: 10.1145/3338472.3338483
Citations: 2
Abstract
This paper presents a way to track eye gaze using a webcam and to map the gaze data to locations on the display screen while compensating for head pose and orientation. First, we showed a blank screen with red dots to 10 individuals and, via automated annotation, recorded their eye-gaze patterns and head orientations associated with each screen location. Then, we trained a neural network to learn the relationship between eye gaze, head pose, and screen location. The proposed method maps eye gaze to screen locations with 68.3% accuracy. Next, using the trained model to estimate on-screen gaze, we evaluated the content of a website, giving us an automated way to evaluate its UI. The resulting evaluation metric could be combined with several other metrics to define a standard for web design and layout. The approach also gives insight into users' likes and dislikes and into which areas of a website matter most. Moreover, eye tracking with only a webcam makes the technology simple to deploy in many fields, opening the prospect of a wide range of applications.
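The calibration step described above, collecting gaze and head-pose features for known on-screen dot positions and fitting a model that maps features to screen coordinates, can be sketched as follows. This is a hypothetical illustration, not the authors' code: it substitutes a least-squares linear model for their neural network, and the five-dimensional feature vector (gaze direction plus yaw, pitch, roll) and the synthetic calibration data are assumptions for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data standing in for the red-dot recordings:
# each row is a hypothetical feature vector [gaze_x, gaze_y, yaw, pitch, roll].
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 5))

# Assume an unknown linear relation between features and screen position,
# plus a little measurement noise (the ground truth the model must recover).
W_true = rng.normal(size=(5, 2))
Y = X @ W_true + 0.01 * rng.normal(size=(n, 2))  # target screen (x, y)

# Fit in homogeneous coordinates so a bias term is learned as well.
Xh = np.hstack([X, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(Xh, Y, rcond=None)

def gaze_to_screen(features, W=W):
    """Map a 5-d gaze/head-pose feature vector to an (x, y) screen position."""
    f = np.append(np.asarray(features, dtype=float), 1.0)
    return f @ W

pred = gaze_to_screen(X[0])
```

In the paper a neural network plays the role of `W`, which lets the mapping absorb the nonlinear interaction between head orientation and gaze direction that a single linear fit cannot capture.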