Subject-Independent Emotion Recognition During Music Listening Based on EEG Using Deep Convolutional Neural Networks

Panayu Keelawat, Nattapong Thammasan, B. Kijsirikul, M. Numao
DOI: 10.1109/CSPA.2019.8696054
Published in: 2019 IEEE 15th International Colloquium on Signal Processing & Its Applications (CSPA), 2019-03-08
Citations: 14

Abstract

Emotion recognition during music listening using electroencephalogram (EEG) signals has recently gained increasing attention from researchers. Many studies have focused on accuracy for a single subject, while subject-independent performance has remained unclear. The objective of this paper is to create an emotion recognition model that can be applied across multiple subjects. By adopting convolutional neural networks (CNNs), we can exploit information from both electrodes and time steps. CNNs also remove the need for explicit feature extraction, which might leave out related but unobserved features. CNNs with three to seven convolutional layers were deployed in this research. We measured their performance on binary classification tasks over two dimensions of emotion: arousal and valence. The results show that our method captured EEG signal patterns from numerous subjects, achieving 81.54% and 86.87% accuracy for arousal and valence respectively under 10-fold cross-validation. The method also generalized to unseen subjects better than the previous method, as observed in the results of leave-one-subject-out validation.
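To make the core idea concrete — a convolution sliding over the raw electrode × time-step matrix, so no hand-crafted EEG features are needed — the following is a minimal NumPy sketch of one temporal convolution layer followed by pooling and a sigmoid output for a binary (e.g. high/low arousal) decision. All shapes, filter counts, and names here are illustrative assumptions, not the paper's actual three-to-seven-layer architecture.

```python
import numpy as np

def conv1d_time(x, kernels):
    """Valid 1-D convolution along the time axis.

    x:       (channels, time)           raw EEG window
    kernels: (n_filters, channels, k)   temporal filters spanning all electrodes
    returns: (n_filters, time - k + 1)  feature maps
    """
    n_f, ch, k = kernels.shape
    t_out = x.shape[1] - k + 1
    out = np.empty((n_f, t_out))
    for f in range(n_f):
        for t in range(t_out):
            # Each filter mixes all electrodes over a short time span.
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k])
    return out

def forward(x, kernels, w, b):
    """Tiny CNN forward pass: conv -> ReLU -> global average pool -> sigmoid."""
    h = np.maximum(conv1d_time(x, kernels), 0.0)  # ReLU feature maps
    pooled = h.mean(axis=1)                       # (n_filters,) summary per filter
    logit = pooled @ w + b
    return 1.0 / (1.0 + np.exp(-logit))           # probability of the positive class

# Hypothetical example: 12 electrodes, 256 time steps, 8 filters of width 5.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((12, 256))
kernels = rng.standard_normal((8, 12, 5)) * 0.1
w = rng.standard_normal(8) * 0.1
p = forward(eeg, kernels, w, 0.0)
print(f"P(high arousal) = {float(p):.4f}")
```

In a trained deeper network the filters are learned by backpropagation and several such conv layers are stacked; the point of the sketch is only that the input is the raw electrode × time matrix, with no separate feature-extraction stage.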