Looking at faces: autonomous perspective invariant facial gaze analysis
Justin K. Bennett, S. Sridharan, Brendan David-John, Reynold J. Bailey
Proceedings of the ACM Symposium on Applied Perception
Published: 2016-07-22 · DOI: 10.1145/2931002.2931005
Citations: 4
Abstract
Eye-tracking provides a mechanism for researchers to monitor where subjects deploy their visual attention. Eye-tracking has been used to gain insights into how humans scrutinize faces; however, the majority of these studies were conducted using desktop-mounted eye-trackers, where the subject sits and views a screen during the experiment. The stimuli in these experiments are typically photographs or videos of human faces. In this paper we present a novel approach using head-mounted eye-trackers that allows for automatic generation of gaze statistics for tasks performed in real-world environments. We use a trained hierarchy of Haar cascade classifiers to automatically detect and segment faces in the eye-tracker's scene camera video. We can then determine whether fixations fall within the bounds of the face or other possible regions of interest and report relevant gaze statistics. Our method is easily adaptable to any feature-trained cascade, allowing for rapid object detection and tracking. We compare our results with previous research on the perception of faces in social environments. We also explore correlations between gaze and confidence levels measured during a mock interview experiment.
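The core step the abstract describes, testing whether a fixation from the eye-tracker falls inside a detected face (or other region-of-interest) bounding box, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the `(label, x, y, w, h)` region format are assumptions, with boxes standing in for the output of a Haar cascade detector such as OpenCV's `CascadeClassifier.detectMultiScale`.

```python
def fixation_in_regions(fixation, regions):
    """Map a fixation point to the first region of interest it falls inside.

    fixation: (x, y) gaze point in scene-camera pixel coordinates.
    regions:  iterable of (label, x, y, w, h) bounding boxes, e.g. face
              detections from a Haar cascade run on the scene camera frame.
    Returns the matching region's label, or None if the fixation lies
    outside every region.
    """
    fx, fy = fixation
    for label, x, y, w, h in regions:
        # Axis-aligned bounding-box containment test.
        if x <= fx <= x + w and y <= fy <= y + h:
            return label
    return None


# Illustrative usage with a hypothetical face detection at (40, 40), 30x30 px.
regions = [("face", 40, 40, 30, 30)]
print(fixation_in_regions((50, 50), regions))  # inside the face box
print(fixation_in_regions((10, 10), regions))  # outside all boxes
```

Aggregating these per-frame labels over the video yields the kind of gaze statistics the paper reports, such as the fraction of fixation time spent on faces.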