EEG Emotion Recognition using Parallel Hybrid Convolutional-Recurrent Neural Networks

Nursilva Aulianisa Putri, Esmeralda Contessa Djamal, Fikri Nugraha, Fatan Kasyidi
{"title":"基于并行混合卷积-递归神经网络的脑电情绪识别","authors":"Nursilva Aulianisa Putri, Esmeralda Contessa Djamal, Fikri Nugraha, Fatan Kasyidi","doi":"10.1109/ICoDSA55874.2022.9862853","DOIUrl":null,"url":null,"abstract":"Electroencephalogram (EEG) signals of certain emotions contain waves with specific frequency bands. So, emotion recognition uses the network containing each wave to become relevant. EEG signals record electrical activity in the brain from several channels. Therefore, EEG signal processing needs consideration to spatial and temporal. Spatial is a signal between channels, while temporal is a sequence. Several methods were used, Convolutional Neural Networks (CNN) with various dimensions, Recurrent Neural Networks (RNN), and hybrid CNN-RNN. This paper proposed a hybrid 2D CNN-RNN method for identifying emotions from a parallel network of each wave. Two-dimensional CNN is used in channel extraction in a short time of the signal. Using short-time signals is intended to minimize the non-stationary characteristic of EEG signals. Meanwhile, the identification of emotions is carried out with RNN using the output of 2D CNN extraction. The modeling and testing used a dataset from SEED, with three emotion classes: positive, neutral, and negative. The experimental results show that using a split network of each wave increased accuracy from 80.92% to 84.71% and a decreased Loss value. While the use of 2D CNN only increased a less significant accuracy than 1D CNN. Evaluation of the waves shows that Beta and Gamma waves provided the best precision, 87-91%, and Theta waves gave 79-85% precision. Alpha wave degrades overall performance, which only has 56-61% precision, considering it is a mid-wave between Theta and Beta. It is necessary to choose the proper weight updating technique. Adaptive Moment (Adam) increased accuracy than AdaDelta, AdaGrad, and RMSprop.","PeriodicalId":339135,"journal":{"name":"2022 International Conference on Data Science and Its Applications (ICoDSA)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"EEG Emotion Recognition using Parallel Hybrid Convolutional-Recurrent Neural Networks\",\"authors\":\"Nursilva Aulianisa Putri, Esmeralda Contessa Djamal, Fikri Nugraha, Fatan Kasyidi\",\"doi\":\"10.1109/ICoDSA55874.2022.9862853\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Electroencephalogram (EEG) signals of certain emotions contain waves with specific frequency bands. So, emotion recognition uses the network containing each wave to become relevant. EEG signals record electrical activity in the brain from several channels. Therefore, EEG signal processing needs consideration to spatial and temporal. Spatial is a signal between channels, while temporal is a sequence. Several methods were used, Convolutional Neural Networks (CNN) with various dimensions, Recurrent Neural Networks (RNN), and hybrid CNN-RNN. This paper proposed a hybrid 2D CNN-RNN method for identifying emotions from a parallel network of each wave. Two-dimensional CNN is used in channel extraction in a short time of the signal. Using short-time signals is intended to minimize the non-stationary characteristic of EEG signals. Meanwhile, the identification of emotions is carried out with RNN using the output of 2D CNN extraction. The modeling and testing used a dataset from SEED, with three emotion classes: positive, neutral, and negative. 
The experimental results show that using a split network of each wave increased accuracy from 80.92% to 84.71% and a decreased Loss value. While the use of 2D CNN only increased a less significant accuracy than 1D CNN. Evaluation of the waves shows that Beta and Gamma waves provided the best precision, 87-91%, and Theta waves gave 79-85% precision. Alpha wave degrades overall performance, which only has 56-61% precision, considering it is a mid-wave between Theta and Beta. It is necessary to choose the proper weight updating technique. Adaptive Moment (Adam) increased accuracy than AdaDelta, AdaGrad, and RMSprop.\",\"PeriodicalId\":339135,\"journal\":{\"name\":\"2022 International Conference on Data Science and Its Applications (ICoDSA)\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Conference on Data Science and Its Applications (ICoDSA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICoDSA55874.2022.9862853\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Data Science and Its Applications (ICoDSA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICoDSA55874.2022.9862853","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Electroencephalogram (EEG) signals associated with particular emotions contain waves in specific frequency bands, so an emotion-recognition network that processes each band separately is a natural fit. EEG signals record the brain's electrical activity across several channels, so EEG processing has to account for both spatial structure (the relationship between channels) and temporal structure (the sample sequence). Prior work has applied Convolutional Neural Networks (CNN) of various dimensions, Recurrent Neural Networks (RNN), and hybrid CNN-RNN models. This paper proposes a hybrid 2D CNN-RNN method that identifies emotions using a parallel network for each wave band. A two-dimensional CNN extracts inter-channel features from short segments of the signal; short-time segments are used to limit the effect of the EEG signal's non-stationarity. The RNN then identifies the emotion from the features extracted by the 2D CNN. Modeling and testing used the SEED dataset with three emotion classes: positive, neutral, and negative. The experimental results show that splitting the network by wave band increased accuracy from 80.92% to 84.71% and decreased the loss, whereas replacing 1D CNN with 2D CNN gave only a smaller accuracy gain. A per-band evaluation shows that Beta and Gamma waves provided the best precision (87-91%) and Theta waves 79-85%, while the Alpha wave, lying between Theta and Beta, degraded overall performance with only 56-61% precision. Choosing a suitable weight-update technique also matters: Adaptive Moment Estimation (Adam) yielded higher accuracy than AdaDelta, AdaGrad, and RMSprop.
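The paper itself does not include code; the following is a minimal sketch of the kind of parallel per-band hybrid 2D CNN-RNN classifier the abstract describes, written in Keras/TensorFlow as an assumed framework. The channel count (62, as in SEED), the 200-sample window, the filter sizes, and the use of an LSTM as the recurrent layer are illustrative assumptions, not values reported by the authors; only the four-band parallel structure, 2D CNN extraction over channels and time, an RNN classifier, three emotion classes, and the Adam optimizer come from the abstract.

```python
# Sketch (not the authors' code): four parallel 2D CNN-RNN branches,
# one per frequency band, merged into a three-class softmax.
import tensorflow as tf
from tensorflow.keras import layers, models

N_CHANNELS = 62      # SEED uses 62 EEG channels
N_SAMPLES = 200      # assumed short time window (e.g., 1 s at 200 Hz)
BANDS = ["theta", "alpha", "beta", "gamma"]
N_CLASSES = 3        # positive, neutral, negative

def band_branch(name):
    """One parallel branch: 2D CNN over (channels x time), then an RNN over time."""
    inp = layers.Input(shape=(N_CHANNELS, N_SAMPLES, 1), name=f"{name}_band")
    x = layers.Conv2D(16, (3, 3), padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D((2, 2))(x)
    x = layers.Conv2D(32, (3, 3), padding="same", activation="relu")(x)
    x = layers.MaxPooling2D((2, 2))(x)
    # Move the (downsampled) time axis to the front and flatten the rest,
    # so the recurrent layer sees a sequence of per-step CNN feature vectors.
    x = layers.Permute((2, 1, 3))(x)          # -> (time, channels, filters)
    x = layers.Reshape((x.shape[1], -1))(x)   # -> (time, features)
    x = layers.LSTM(64)(x)                    # LSTM chosen here as one RNN variant
    return inp, x

inputs, features = zip(*(band_branch(b) for b in BANDS))
merged = layers.Concatenate()(list(features))
out = layers.Dense(N_CLASSES, activation="softmax")(merged)

model = models.Model(inputs=list(inputs), outputs=out)
# The abstract reports Adam outperforming AdaDelta, AdaGrad, and RMSprop.
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In such a setup, each branch's input would be the same short EEG window band-pass filtered into the Theta, Alpha, Beta, or Gamma range before being fed to its branch, mirroring the per-wave split that the paper credits with the accuracy gain.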