{"title":"Emotion Recognition Using Frontal EEG in VR Affective Scenes","authors":"Tianyuan Xu, Rui-Xiang Yin, Lin Shu, Xiangmin Xu","doi":"10.1109/IMBIOC.2019.8777843","DOIUrl":null,"url":null,"abstract":"Frontal EEG has been widely used for human emotion recognition since its convenience. However, many relevant studies used traditional wet electrodes to collect EEG signals and the stimulation ways were restricted as music, videos and pictures. This paper provides a new framework for emotion recognition using frontal EEG and VR affective scenes. An experiment about VR stimuli EEG data collection was conducted among 19 subjects. The EEG data were collected using textile dry electrodes. EEG features were extracted from time, frequency and space domain in the collected data. Model stacking method were applied in the experiment to ensemble 3 models including GBDT, RF and SVM. The mean accuracy of our framework achieved about 81.30%, which exhibited better performance compared with relevant studies. The framework proposed in this work can be well applied to wearable device for EEG emotion recognition in VR scenes.","PeriodicalId":171472,"journal":{"name":"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IMBIOC.2019.8777843","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 15
Abstract
Frontal EEG has been widely used for human emotion recognition because of its convenience. However, many previous studies collected EEG signals with traditional wet electrodes, and the stimuli were restricted to music, videos and pictures. This paper presents a new framework for emotion recognition using frontal EEG and VR affective scenes. An EEG data-collection experiment with VR stimuli was conducted on 19 subjects, and the signals were recorded with textile dry electrodes. EEG features were extracted from the time, frequency and spatial domains of the collected data. A model-stacking method was applied to ensemble three models: GBDT, RF and SVM. The mean accuracy of our framework reached about 81.30%, outperforming related studies. The proposed framework can readily be applied to wearable devices for EEG emotion recognition in VR scenes.
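The stacking ensemble described in the abstract could be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature matrix, labels, dimensions, and the logistic-regression meta-learner are all placeholder assumptions standing in for the paper's frontal-EEG features and its actual stacking configuration.

```python
# Hypothetical sketch of a model-stacking ensemble over GBDT, RF and SVM,
# as named in the abstract. All data below is synthetic; real inputs would
# be time-, frequency- and spatial-domain features from frontal EEG epochs.
import numpy as np
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))    # 200 EEG epochs x 24 features (placeholder)
y = rng.integers(0, 2, size=200)  # binary emotion labels (placeholder)

stack = StackingClassifier(
    estimators=[
        ("gbdt", GradientBoostingClassifier(random_state=0)),
        ("rf", RandomForestClassifier(random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,  # out-of-fold base-model predictions train the meta-learner
)
scores = cross_val_score(stack, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

With real labeled EEG features in place of the random arrays, the cross-validated accuracy would be the quantity comparable to the ~81.30% reported in the paper.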