High Precision Eye Tracking Based on Electrooculography (EOG) Signal Using Artificial Neural Network (ANN) for Smart Technology Application
Mahtab Alam, M. Raihan, Mubtasim Rafid Chowdhury, A. Shams
2021 24th International Conference on Computer and Information Technology (ICCIT), published 2021-12-18
DOI: 10.1109/ICCIT54785.2021.9689821
Citations: 2
Abstract
The electrooculography (EOG) signal is the potential difference between the cornea and the retina of the eye. The voltage amplitude changes as the eye moves, producing a distinct EOG pattern for each direction of movement. Therefore, by monitoring the EOG signal, it is possible to track eye movement. This EOG-based eye-tracking technique can be extended to maneuver smart wheelchairs for patients with neurodegenerative diseases. Successful operation of such a smart wheelchair requires accurate classification of the EOG signal. In this experimental study, we collected two-channel EOG signals in the laboratory from multiple individuals and propose an Artificial Neural Network (ANN)-based method to differentiate among nine classes of EOG signals: up, down, left, right, down-left, down-right, up-left, up-right, and blink. This wide-ranging classification is suitable for performing complex tasks on smart technology platforms. Our model successfully predicts eye movement from the statistical properties and dominant frequency of the measured EOG signal, with accuracy, precision, recall, and F1 score all at 99%. This is a significant improvement over past studies conducted for the same purpose, and to the authors' knowledge, such high accuracy has not previously been achieved for the nine EOG signal classes mentioned above. The proposed model is suitable for real-time smart applications based on eye movements.
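The abstract describes classifying eye movements from statistical properties and the dominant frequency of a two-channel EOG signal with an ANN. A minimal sketch of such a pipeline is shown below; note that the specific feature set, sampling rate, window length, and network architecture are illustrative assumptions, not the authors' actual configuration, and the synthetic data is a stand-in for real EOG recordings.

```python
# Illustrative sketch (not the paper's exact pipeline): extract simple
# statistical features plus the dominant frequency from a two-channel EOG
# window, then classify with a small feed-forward ANN (scikit-learn MLP).
# Sampling rate, feature choices, and network size are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

FS = 250  # assumed sampling rate in Hz

def eog_features(window):
    """window: (n_samples, 2) array of horizontal/vertical EOG channels."""
    feats = []
    for ch in window.T:
        # Basic statistical properties of the channel
        feats += [ch.mean(), ch.std(), ch.min(), ch.max()]
        # Dominant frequency from the magnitude spectrum
        spectrum = np.abs(np.fft.rfft(ch - ch.mean()))
        freqs = np.fft.rfftfreq(len(ch), d=1.0 / FS)
        feats.append(freqs[np.argmax(spectrum)])
    return np.array(feats)

# Synthetic stand-in data for the 9 classes (up, down, ..., blink);
# real training would use labeled laboratory EOG windows instead.
rng = np.random.default_rng(0)
X = np.array([eog_features(rng.normal(scale=c + 1, size=(FS, 2)))
              for c in range(9) for _ in range(30)])
y = np.repeat(np.arange(9), 30)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, y)
```

In a real deployment the classifier's per-window prediction (e.g. `clf.predict(eog_features(window)[None, :])`) would be mapped to a wheelchair control command.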