An Electrooculogram Signal Based Control System in Offline Environment
Babita Thakur, P. Syal, P. Kumari
Proceedings of the 2017 4th International Conference on Biomedical and Bioinformatics Engineering
Published: 2017-11-12
DOI: 10.1145/3168776.3168794 (https://doi.org/10.1145/3168776.3168794)
Citations: 2
Abstract
Human Machine Interface (HMI) applications based on Electrooculogram (EOG) signals, which convert user intention into control commands, show promising scope in the development of prosthetic devices for persons with motor impairment. In the present work, an EOG-signal-based control system has been investigated in an offline environment. The signals were acquired through g.LADYbird active electrodes placed at distinct positions on the face around the eyes. A classifier model was trained on a feature matrix encapsulating time-domain features extracted with the Dual-Tree Complex Wavelet Transform (DTCWT). A linear Support Vector Machine (SVM) classifier was trained on 240 data sets recorded from 12 healthy subjects. MATLAB simulation showed 99.2% classification accuracy for horizontal eye movement in two directions, left and right. The classified signals were converted into commands through an Arduino to grasp and release an object with a prosthetic myoelectric hand.
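The pipeline the abstract describes (acquire EOG, extract time-domain features, classify left/right, map the class to a grasp/release command) can be sketched in miniature. The sketch below is purely illustrative: it uses synthetic saccade-like pulses instead of recorded EOG, two simple time-domain features (mean and peak) standing in for the paper's DTCWT-derived features, a plain perceptron standing in for the linear SVM, and a hypothetical `COMMAND` mapping standing in for the Arduino output; none of these names or choices come from the paper.

```python
import random

def make_saccade(direction, n=64):
    # Synthetic horizontal-EOG pulse: positive deflection for a rightward
    # eye movement, negative for leftward, plus small Gaussian noise.
    sign = 1.0 if direction == "right" else -1.0
    return [sign * (1.0 if 16 <= i < 48 else 0.0) + random.gauss(0, 0.1)
            for i in range(n)]

def features(sig):
    # Two simple time-domain features (mean and signed peak), a stand-in
    # for the paper's DTCWT feature vector.
    return [sum(sig) / len(sig), max(sig, key=abs)]

def train_perceptron(X, y, epochs=20, lr=0.1):
    # Toy linear classifier standing in for the linear SVM; y uses
    # labels +1 (right) and -1 (left).
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            if t * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * t * xi for wi, xi in zip(w, x)]
                b += lr * t
    return w, b

def predict(w, b, x):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "right" if s > 0 else "left"

# Hypothetical class-to-command mapping; in the paper, commands are sent
# via an Arduino to a prosthetic myoelectric hand.
COMMAND = {"right": "GRASP", "left": "RELEASE"}

random.seed(0)
dirs = ["left", "right"] * 20
X = [features(make_saccade(d)) for d in dirs]
y = [1 if d == "right" else -1 for d in dirs]
w, b = train_perceptron(X, y)

label = predict(w, b, features(make_saccade("right")))
print(label, COMMAND[label])
```

Because the two classes deflect in opposite directions, both features separate linearly, which is why even this toy classifier suffices; the paper's DTCWT + SVM pipeline addresses the harder case of real, noisy recordings.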