Emotion Recognition Using Frontal EEG in VR Affective Scenes

Tianyuan Xu, Rui-Xiang Yin, Lin Shu, Xiangmin Xu
{"title":"基于额叶脑电图的VR情感场景情感识别","authors":"Tianyuan Xu, Rui-Xiang Yin, Lin Shu, Xiangmin Xu","doi":"10.1109/IMBIOC.2019.8777843","DOIUrl":null,"url":null,"abstract":"Frontal EEG has been widely used for human emotion recognition since its convenience. However, many relevant studies used traditional wet electrodes to collect EEG signals and the stimulation ways were restricted as music, videos and pictures. This paper provides a new framework for emotion recognition using frontal EEG and VR affective scenes. An experiment about VR stimuli EEG data collection was conducted among 19 subjects. The EEG data were collected using textile dry electrodes. EEG features were extracted from time, frequency and space domain in the collected data. Model stacking method were applied in the experiment to ensemble 3 models including GBDT, RF and SVM. The mean accuracy of our framework achieved about 81.30%, which exhibited better performance compared with relevant studies. The framework proposed in this work can be well applied to wearable device for EEG emotion recognition in VR scenes.","PeriodicalId":171472,"journal":{"name":"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Emotion Recognition Using Frontal EEG in VR Affective Scenes\",\"authors\":\"Tianyuan Xu, Rui-Xiang Yin, Lin Shu, Xiangmin Xu\",\"doi\":\"10.1109/IMBIOC.2019.8777843\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Frontal EEG has been widely used for human emotion recognition since its convenience. However, many relevant studies used traditional wet electrodes to collect EEG signals and the stimulation ways were restricted as music, videos and pictures. This paper provides a new framework for emotion recognition using frontal EEG and VR affective scenes. An experiment about VR stimuli EEG data collection was conducted among 19 subjects. The EEG data were collected using textile dry electrodes. EEG features were extracted from time, frequency and space domain in the collected data. Model stacking method were applied in the experiment to ensemble 3 models including GBDT, RF and SVM. The mean accuracy of our framework achieved about 81.30%, which exhibited better performance compared with relevant studies. 
The framework proposed in this work can be well applied to wearable device for EEG emotion recognition in VR scenes.\",\"PeriodicalId\":171472,\"journal\":{\"name\":\"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-05-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IMBIOC.2019.8777843\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE MTT-S International Microwave Biomedical Conference (IMBioC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IMBIOC.2019.8777843","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 15

Abstract

Frontal EEG has been widely used for human emotion recognition because of its convenience. However, many related studies collected EEG signals with traditional wet electrodes, and the stimuli were limited to music, videos and pictures. This paper proposes a new framework for emotion recognition using frontal EEG and VR affective scenes. An EEG data-collection experiment with VR stimuli was conducted on 19 subjects, with the EEG signals recorded using textile dry electrodes. EEG features were extracted from the time, frequency and spatial domains of the collected data. A model-stacking method was applied to ensemble three models: GBDT, RF and SVM. The mean accuracy of the framework was about 81.30%, which compares favorably with related studies. The proposed framework can be readily applied to wearable devices for EEG-based emotion recognition in VR scenes.
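The abstract describes frequency-domain (and other) features from frontal EEG followed by a stacking ensemble of GBDT, RF and SVM, but it does not specify the exact feature set, channels, or hyperparameters. The following is a minimal illustrative sketch of that kind of pipeline using scikit-learn's StackingClassifier; the sampling rate, band definitions, channel count, synthetic data and meta-learner are all assumptions for demonstration, not the authors' actual implementation.

```python
"""Sketch of band-power feature extraction + GBDT/RF/SVM stacking.

Assumptions (not from the paper): 250 Hz sampling rate, 4 frontal channels,
theta/alpha/beta bands, synthetic epochs, logistic-regression meta-learner.
"""
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # common EEG bands

def band_power_features(epoch):
    """epoch: (n_channels, n_samples) -> concatenated mean band power per channel."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)

# Synthetic stand-in data: 200 two-second epochs from 4 frontal channels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 4, 2 * FS))
labels = rng.integers(0, 2, size=200)  # binary emotion labels (e.g., valence)
X = np.array([band_power_features(e) for e in epochs])

# Stacking ensemble of GBDT, RF and SVM, combined by a logistic-regression meta-learner.
stack = StackingClassifier(
    estimators=[
        ("gbdt", GradientBoostingClassifier()),
        ("rf", RandomForestClassifier(n_estimators=200)),
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
print("CV accuracy (synthetic data):", cross_val_score(stack, X, labels, cv=5).mean())
```

On real frontal-EEG epochs the same structure would apply, with the synthetic arrays replaced by preprocessed recordings and the feature extractor extended with the time- and spatial-domain features the paper mentions.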