Accuracy Improvement of Object Selection in Gaze Gesture Application using Deep Learning
M. Alfaroby E., S. Wibirama, I. Ardiyanto
2020 12th International Conference on Information Technology and Electrical Engineering (ICITEE), 2020-10-06
DOI: 10.1109/ICITEE49829.2020.9271771
Citations: 5
Abstract
Gaze-based interaction is a crucial research area. Gaze gestures provide faster interaction between a user and a computer application because people naturally look at an object of interest before taking any other action. Spontaneous gaze-gesture-based applications use gaze gestures as an input modality without performing any calibration. Conventional eye tracking systems suffer from low accuracy: in general, the data captured by an eye tracker contain errors and noise within the gaze position signal. These errors and noise degrade the performance of object selection in gaze-gesture-based applications that control digital content on a display using smooth-pursuit eye movements. The conventional object selection method suffers from low accuracy (<80%). In this paper, we address this accuracy problem with a novel deep learning approach, exploiting the power of deep learning to recognize patterns in eye-gaze data. Long Short-Term Memory (LSTM) is a deep learning architecture based on the recurrent neural network (RNN). We used an LSTM to perform the object selection task. The dataset consisted of 34 participants and was taken from a previous study of an object selection technique for a gaze-gesture-based application. Our experimental results show that the proposed method achieved an accuracy of 96.17%. In the future, our results may serve as guidance for developing gaze gesture applications.
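The core idea described above, feeding a sequence of gaze positions to an LSTM that classifies which on-screen object the user is pursuing, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation: the dimensions (2-D gaze samples, 16 hidden units, 4 candidate objects, 30-sample sequences) and the random weights are assumptions for demonstration only, and the gaze sequence is random noise standing in for real eye-tracker data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step; gates are stacked as [input, forget, candidate, output]."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # all four gate pre-activations at once
    i = sigmoid(z[:n])                # input gate
    f = sigmoid(z[n:2 * n])           # forget gate
    g = np.tanh(z[2 * n:3 * n])       # candidate cell state
    o = sigmoid(z[3 * n:])            # output gate
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c

# Hypothetical setup: 2-D gaze samples (x, y), 16 hidden units,
# 4 candidate on-screen objects to select among.
in_dim, hidden, n_objects, seq_len = 2, 16, 4, 30
W = rng.normal(0, 0.1, (4 * hidden, in_dim))
U = rng.normal(0, 0.1, (4 * hidden, hidden))
b = np.zeros(4 * hidden)
W_out = rng.normal(0, 0.1, (n_objects, hidden))  # linear readout (untrained)

gaze_sequence = rng.normal(0, 1, (seq_len, in_dim))  # stand-in for eye-tracker data
h = np.zeros(hidden)
c = np.zeros(hidden)
for x in gaze_sequence:
    h, c = lstm_step(x, h, c, W, U, b)

scores = W_out @ h                    # score each candidate object
selected = int(np.argmax(scores))     # index of the predicted target object
```

In a trained system, `W`, `U`, `b`, and `W_out` would be learned from labeled gaze sequences (here, the 34-participant dataset), and the readout would be followed by a softmax cross-entropy loss during training.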