Bao-Wei Chu, Wei-Liang Ou, Robert Chen-Hao Chang, Chih-Peng Fan
{"title":"Calibration-Free Gaze Estimation by Combination with Hand and Facial Features Detection for Interactive Advertising Display","authors":"Bao-Wei Chu, Wei-Liang Ou, Robert Chen-Hao Chang, Chih-Peng Fan","doi":"10.1109/ICCE59016.2024.10444285","DOIUrl":null,"url":null,"abstract":"In recent years, in addition to droplet infection, hand infection is also one of the major transmission routes of the new crown epidemic. Therefore, smart advertising machines that can be operated without hand contact are an important research topic. Gaze estimation is a technology that can identify the direction of gaze, which can infer the customer’s visual attention and convert it into interactive input information. Gaze estimation has great potential in contactless interactions as it allows systems to detect gaze focus and display relevant information/advertisements based on customer interests. The proposed design methodology is divided into three steps: face/hand objects detection by YOLO, threshold estimation by feature points, and gaze region detection by SVM classifier. The experimental results show that the average FPS with YOLO-based model reaches 15 when the number of filters is reduced to a quarter and the input size is set to 416x416 pixels. 
In the case of 4-block gaze regions, the YOLO-based model maintains a good enough accuracy that is up to 90%, and the experimental results reveals that the proposed expectation by adding hand features can effectively raise the accuracy of gaze estimation.","PeriodicalId":518694,"journal":{"name":"2024 IEEE International Conference on Consumer Electronics (ICCE)","volume":"66 9","pages":"1-3"},"PeriodicalIF":0.0000,"publicationDate":"2024-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2024 IEEE International Conference on Consumer Electronics (ICCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCE59016.2024.10444285","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
In recent years, besides droplet transmission, contact transmission via the hands has been one of the major routes of COVID-19 infection. Smart advertising displays that can be operated without hand contact are therefore an important research topic. Gaze estimation is a technology that identifies the direction of gaze, from which a customer's visual attention can be inferred and converted into interactive input. It has great potential for contactless interaction, since a system can detect the gaze focus and display relevant information or advertisements matched to customer interests. The proposed design methodology is divided into three steps: face/hand object detection by YOLO, threshold estimation from feature points, and gaze-region detection by an SVM classifier. The experimental results show that the average frame rate of the YOLO-based model reaches 15 FPS when the number of filters is reduced to one quarter and the input size is set to 416x416 pixels. With 4-block gaze regions, the YOLO-based model maintains an accuracy of up to 90%, and the results reveal that adding hand features, as proposed, effectively raises the accuracy of gaze estimation.
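The third step of the pipeline, classifying a feature vector into one of the 4-block gaze regions with an SVM, can be sketched as follows. This is a minimal illustration only: the paper's actual features come from YOLO face/hand detection and feature-point thresholding, whereas here the feature vectors, their dimensionality, and the use of scikit-learn's `SVC` are all assumptions, with synthetic data standing in for the extracted features.

```python
# Hypothetical sketch of gaze-region classification by SVM (the paper's
# third step). Feature extraction (YOLO face/hand detection, feature-point
# thresholding) is replaced by synthetic clustered vectors; the feature
# count and SVM settings are assumptions, not the paper's implementation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

N_REGIONS = 4    # 4-block gaze regions, as in the paper's experiments
N_FEATURES = 6   # assumed: e.g. eye-corner offsets, iris position, hand cues
N_SAMPLES = 50   # synthetic training frames per region

# Synthetic training data: each region's feature vectors cluster
# around that region's own mean vector.
centers = rng.normal(size=(N_REGIONS, N_FEATURES))
X = np.vstack([c + 0.1 * rng.normal(size=(N_SAMPLES, N_FEATURES))
               for c in centers])
y = np.repeat(np.arange(N_REGIONS), N_SAMPLES)

# Train an RBF-kernel SVM to map feature vectors to region indices.
clf = SVC(kernel="rbf").fit(X, y)

# At run time, each frame's extracted feature vector is classified
# into one of the four gaze regions.
frame_features = centers + 0.05 * rng.normal(size=(N_REGIONS, N_FEATURES))
regions = clf.predict(frame_features)
print(regions)
```

In a real deployment the per-frame feature vector would be assembled from the detected face/hand boxes and thresholded feature points before being passed to `clf.predict`, and the predicted region index would select which advertisement to display.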