{"title":"Investigation of Appropriate Classification Method for EOG Based Human Computer Interface","authors":"Muna Layth Abdulateef Al-Zubaidi, Selim Aras","doi":"10.1109/SIU55565.2022.9864953","DOIUrl":null,"url":null,"abstract":"The reason why real feelings and mood changes can be seen through our eyes is that the eyes provide the most revealing and accurate information of all human communication signs. It is possible to control a human-computer interface by voluntarily moving the eyes, which have an important place in communication. In this study, the appropriate feature and classification methods were investigated to use the Electooculography signs obtained from seven different voluntary eye movements in the human-computer interface. The success of the system is increased by determining the combination that gives the best result from many features by using the sequential forward feature selection method. The developed method reached 93.9% success in the seven-class dataset. The results show that human-computer interface control can be done with high accuracy with voluntary eye movements. Also, the development of a real-time working model is inspiring for work.","PeriodicalId":115446,"journal":{"name":"2022 30th Signal Processing and Communications Applications Conference (SIU)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 30th Signal Processing and Communications Applications Conference (SIU)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SIU55565.2022.9864953","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Real feelings and mood changes can be seen through the eyes because, of all human communication cues, the eyes provide the most revealing and accurate information. Since eye movements play an important role in communication, a human-computer interface can be controlled by moving the eyes voluntarily. In this study, appropriate feature and classification methods were investigated for using electrooculography (EOG) signals obtained from seven different voluntary eye movements in a human-computer interface. The success of the system was increased by using the sequential forward feature selection method to determine the feature combination that gives the best result among many candidate features. The developed method reached 93.9% accuracy on the seven-class dataset. The results show that a human-computer interface can be controlled with high accuracy using voluntary eye movements, and they are encouraging for the development of a real-time working model.
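The abstract's central step is sequential forward feature selection: starting from an empty set, the feature that most improves cross-validated classification accuracy is added greedily until performance stops improving. The sketch below illustrates this idea in Python; the SVM classifier, the number of selected features, and the synthetic data standing in for the EOG feature matrix are all illustrative assumptions, since the abstract does not specify them. Only the forward-selection strategy and the seven-class setup come from the paper.

```python
# Minimal sketch of sequential forward feature selection for a 7-class
# EOG eye-movement classifier. Classifier choice, feature count, and the
# synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows = EOG trials, columns = candidate features
# (e.g., amplitude, duration, slope values extracted per eye movement).
rng = np.random.default_rng(0)
X = rng.normal(size=(700, 20))        # 700 trials, 20 candidate features
y = rng.integers(0, 7, size=700)      # 7 voluntary eye-movement classes

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Greedily add, one at a time, the feature that most improves
# cross-validated accuracy (forward direction).
sfs = SequentialFeatureSelector(
    clf, n_features_to_select=8, direction="forward", cv=5
)
sfs.fit(X, y)
selected = np.flatnonzero(sfs.get_support())
print("Selected feature indices:", selected)

# Evaluate the classifier on the selected feature subset.
acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
print(f"Cross-validated accuracy: {acc:.3f}")
```

On real EOG features this loop would report the accuracy of the best feature combination found; with the random placeholder data above the score is only near chance, which is expected.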