Vowel Identification from Neural Signals during Articulated Speech
Ashley Bishop, Anandha Sree Retnapandian, Sandhya Chengaiyan, K. Anandan
2020 Sixth International Conference on Bio Signals, Images, and Instrumentation (ICBSII), February 2020
DOI: 10.1109/ICBSII49132.2020.9167550
Citations: 2
Abstract
Speech interfaces are now widely accepted and are integrated into various real-life applications and assistive devices for the impaired. Recent advances in EEG technology have made the Brain Computer Interface (BCI) one of the most exciting fields of biomedical research, and the non-invasive nature of EEG has drawn particular research interest. In this paper, conventional brain connectivity measures are employed to recognize articulated vowels from recorded brain activity. The work reveals that brain activity registered from the temporal and parietal regions carries rich information related to speech production and comprehension. The analysis also showed that speech-related information is concentrated within the theta, alpha and beta EEG sub-bands. The extracted functional connectivity parameters were used to train a multilayer perceptron to identify the articulated vowel.
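The pipeline described in the abstract (connectivity features per trial fed to a multilayer perceptron classifying the articulated vowel) can be sketched as below. This is a minimal illustration only: the feature dimensionality, trial count, network architecture, and the synthetic stand-in data are all assumptions, not details from the paper, which does not publish its exact configuration here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical setup: each trial is a vector of functional-connectivity
# parameters (e.g. values computed between electrode pairs over the
# theta, alpha and beta sub-bands); labels are the five articulated vowels.
rng = np.random.default_rng(0)
n_trials, n_features, n_vowels = 200, 30, 5

# Synthetic stand-in for the extracted connectivity features: random
# noise plus a class-dependent shift so the classes are separable.
y = rng.integers(0, n_vowels, size=n_trials)
X = rng.normal(size=(n_trials, n_features)) + y[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# A small multilayer perceptron; the hidden-layer size here is a guess,
# not the architecture used in the paper.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

In practice the feature vector would come from connectivity analysis of band-filtered EEG from the temporal and parietal electrodes rather than from synthetic data.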