Enhancing Emotion Detection with Non-invasive Multi-Channel EEG and Hybrid Deep Learning Architecture

IF 1.5 · CAS Quartile 4 (Engineering & Technology) · JCR Q3 (ENGINEERING, ELECTRICAL & ELECTRONIC)
Durgesh Nandini, Jyoti Yadav, Asha Rani, Vijander Singh
{"title":"利用无创多通道脑电图和混合深度学习架构加强情绪检测","authors":"Durgesh Nandini, Jyoti Yadav, Asha Rani, Vijander Singh","doi":"10.1007/s40998-024-00710-4","DOIUrl":null,"url":null,"abstract":"<p>Emotion recognition is vital for augmenting human–computer interactions by integrating emotional contextual information for enhanced communication. Hence, the study presents an intelligent emotion detection system developed utilizing hybrid stacked gated recurrent units (GRU)-recurrent neural network (RNN) deep learning architecture. Integration of GRU with RNN allows the system to make use of both models’ capabilities, making it better at capturing complex emotional patterns and temporal correlations. The EEG signals are investigated in time, frequency, and time–frequency domains, meticulously curated to capture intricate multi-domain patterns. Then, the SMOTE-Tomek method ensures a uniform class distribution, while the PCA technique optimizes features by minimizing data redundancy. A comprehensive experimentation including the well-established emotion datasets: DEAP and AMIGOS, assesses the efficacy of the hybrid stacked GRU and RNN architecture in contrast to 1D convolution neural network, RNN and GRU models. Moreover, the “Hyperopt” technique fine-tunes the model’s hyperparameter, improving the average accuracy by about 3.73%. Hence, results revealed that the hybrid GRU-RNN model demonstrates the most optimal performance with the highest classification accuracies of 99.77% ± 0.13, 99.54% ± 0.16, 99.82% ± 0.14, and 99.68% ± 0.13 for the 3D VAD and liking parameter, respectively. Furthermore, the model’s generalizability is examined using the cross-subject and database analysis on the DEAP and AMIGOS datasets, exhibiting a classification with an average accuracy of about 99.75% ± 0.10 and 99.97% ± 0.03. Obtained results when compared with the existing methods in literature demonstrate superior performance, highlighting potential in emotion recognition.</p>","PeriodicalId":49064,"journal":{"name":"Iranian Journal of Science and Technology-Transactions of Electrical Engineering","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2024-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Enhancing Emotion Detection with Non-invasive Multi-Channel EEG and Hybrid Deep Learning Architecture\",\"authors\":\"Durgesh Nandini, Jyoti Yadav, Asha Rani, Vijander Singh\",\"doi\":\"10.1007/s40998-024-00710-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Emotion recognition is vital for augmenting human–computer interactions by integrating emotional contextual information for enhanced communication. Hence, the study presents an intelligent emotion detection system developed utilizing hybrid stacked gated recurrent units (GRU)-recurrent neural network (RNN) deep learning architecture. Integration of GRU with RNN allows the system to make use of both models’ capabilities, making it better at capturing complex emotional patterns and temporal correlations. The EEG signals are investigated in time, frequency, and time–frequency domains, meticulously curated to capture intricate multi-domain patterns. Then, the SMOTE-Tomek method ensures a uniform class distribution, while the PCA technique optimizes features by minimizing data redundancy. 
A comprehensive experimentation including the well-established emotion datasets: DEAP and AMIGOS, assesses the efficacy of the hybrid stacked GRU and RNN architecture in contrast to 1D convolution neural network, RNN and GRU models. Moreover, the “Hyperopt” technique fine-tunes the model’s hyperparameter, improving the average accuracy by about 3.73%. Hence, results revealed that the hybrid GRU-RNN model demonstrates the most optimal performance with the highest classification accuracies of 99.77% ± 0.13, 99.54% ± 0.16, 99.82% ± 0.14, and 99.68% ± 0.13 for the 3D VAD and liking parameter, respectively. Furthermore, the model’s generalizability is examined using the cross-subject and database analysis on the DEAP and AMIGOS datasets, exhibiting a classification with an average accuracy of about 99.75% ± 0.10 and 99.97% ± 0.03. Obtained results when compared with the existing methods in literature demonstrate superior performance, highlighting potential in emotion recognition.</p>\",\"PeriodicalId\":49064,\"journal\":{\"name\":\"Iranian Journal of Science and Technology-Transactions of Electrical Engineering\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2024-03-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Iranian Journal of Science and Technology-Transactions of Electrical Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1007/s40998-024-00710-4\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Iranian Journal of Science and Technology-Transactions of Electrical Engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s40998-024-00710-4","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract


Emotion recognition is vital for augmenting human–computer interaction, as integrating emotional contextual information enables richer communication. This study presents an intelligent emotion detection system built on a hybrid stacked gated recurrent unit (GRU)-recurrent neural network (RNN) deep learning architecture. Integrating GRU with RNN lets the system exploit both models' strengths, improving its ability to capture complex emotional patterns and temporal correlations. The EEG signals are analyzed in the time, frequency, and time–frequency domains, meticulously curated to capture intricate multi-domain patterns. The SMOTE-Tomek method then ensures a uniform class distribution, while PCA optimizes the features by minimizing data redundancy. Comprehensive experiments on the well-established DEAP and AMIGOS emotion datasets assess the efficacy of the hybrid stacked GRU-RNN architecture against 1D convolutional neural network, RNN, and GRU baselines. Moreover, the "Hyperopt" technique fine-tunes the model's hyperparameters, improving average accuracy by about 3.73%. Results reveal that the hybrid GRU-RNN model delivers the best performance, with the highest classification accuracies of 99.77% ± 0.13, 99.54% ± 0.16, 99.82% ± 0.14, and 99.68% ± 0.13 for valence, arousal, dominance (the 3D VAD space), and liking, respectively. Furthermore, the model's generalizability is examined via cross-subject and cross-database analyses on the DEAP and AMIGOS datasets, yielding average classification accuracies of about 99.75% ± 0.10 and 99.97% ± 0.03. Compared with existing methods in the literature, the obtained results demonstrate superior performance, highlighting the model's potential in emotion recognition.
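The abstract states that EEG features are drawn from the time, frequency, and time-frequency domains but does not enumerate them. As a minimal sketch, the snippet below computes one representative feature per domain for a single channel using SciPy and PyWavelets; the specific features, the db4 wavelet, and the 128 Hz sampling rate (DEAP's preprocessed rate) are illustrative assumptions, not details taken from the paper.

```python
# One representative feature per domain for a single EEG channel.
# Feature choices are illustrative; the abstract does not enumerate
# the paper's feature set. fs=128 matches DEAP's preprocessed data.
import numpy as np
from scipy.signal import welch
import pywt

def multi_domain_features(x, fs=128):
    feats = {}
    # Time domain: simple amplitude statistics.
    feats["mean"] = np.mean(x)
    feats["std"] = np.std(x)

    # Frequency domain: mean power in the alpha band (8-13 Hz) via Welch.
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    alpha = (f >= 8) & (f <= 13)
    feats["alpha_power"] = np.trapz(pxx[alpha], f[alpha])

    # Time-frequency domain: energy of each wavelet decomposition level.
    coeffs = pywt.wavedec(x, "db4", level=4)
    for i, c in enumerate(coeffs):
        feats[f"wavelet_energy_{i}"] = np.sum(c ** 2)
    return feats
```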
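The class-balancing and redundancy-reduction steps map directly onto standard library calls. Below is a minimal sketch using imbalanced-learn's SMOTETomek and scikit-learn's PCA; the 95% explained-variance target and the standardization step are assumptions, as the abstract gives no such details.

```python
# Minimal preprocessing sketch: SMOTE-Tomek balancing + PCA reduction.
# Assumes X is an (n_samples, n_features) array of EEG features and
# y the per-trial emotion labels; the 0.95 variance target is illustrative.
from imblearn.combine import SMOTETomek
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def balance_and_reduce(X, y, variance=0.95, seed=42):
    # Oversample minority classes (SMOTE) and drop Tomek links to
    # clean the class boundary, yielding a near-uniform distribution.
    X_bal, y_bal = SMOTETomek(random_state=seed).fit_resample(X, y)

    # Standardize before PCA so no single feature dominates the components.
    X_std = StandardScaler().fit_transform(X_bal)

    # Keep enough principal components to retain `variance` of the energy.
    X_red = PCA(n_components=variance).fit_transform(X_std)
    return X_red, y_bal
```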
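The abstract does not specify the topology of the "hybrid stacked GRU-RNN", so the Keras sketch below is only one plausible reading: stacked GRU layers feeding a vanilla RNN layer and a softmax head. Layer widths, dropout, and the two-class (high/low) output are assumptions.

```python
# Sketch of a "hybrid stacked GRU-RNN" classifier in Keras.
# Layer sizes, dropout, and the two-class (high/low) head are assumptions;
# the abstract does not give the exact topology.
from tensorflow.keras import layers, models

def build_gru_rnn(timesteps, n_features, n_classes=2):
    model = models.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        # Stacked GRUs: return_sequences=True keeps the full sequence
        # so the next recurrent layer still sees temporal structure.
        layers.GRU(128, return_sequences=True),
        layers.Dropout(0.3),
        layers.GRU(64, return_sequences=True),
        # A vanilla RNN layer consumes the GRU features and emits the
        # final temporal summary vector.
        layers.SimpleRNN(32),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```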
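Finally, the reported ~3.73% accuracy gain comes from tuning with the Hyperopt library. The snippet below shows the standard fmin/TPE usage pattern; the search space and the train_and_validate helper are hypothetical stand-ins for whatever the authors actually tuned.

```python
# Minimal Hyperopt loop: TPE search over a small hyperparameter space.
# The space and objective are illustrative; the paper's actual search
# space is not described in the abstract.
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials

space = {
    "units":   hp.choice("units", [32, 64, 128]),
    "dropout": hp.uniform("dropout", 0.1, 0.5),
    "lr":      hp.loguniform("lr", -9, -4),  # ~1.2e-4 .. 1.8e-2
}

def objective(params):
    # Train the GRU-RNN model with `params` and return the validation
    # error; train_and_validate is a hypothetical helper.
    val_acc = train_and_validate(**params)
    return {"loss": 1.0 - val_acc, "status": STATUS_OK}

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("Best hyperparameters:", best)
```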

Source journal: Iranian Journal of Science and Technology-Transactions of Electrical Engineering
CiteScore: 5.50
Self-citation rate: 4.20%
Annual publications: 93
Review time: >12 weeks
About the journal: Transactions of Electrical Engineering aims to foster the growth of scientific research in all branches of electrical engineering and its related fields, and to provide a medium through which the fruits of this research may be brought to the attention of the world's scientific communities. The journal focuses on frontier topics in theoretical, mathematical, numerical, experimental, and scientific developments in electrical engineering, as well as applications of established techniques to new domains in various electrical engineering disciplines, such as: bioelectrics, biomechanics, bioinstrumentation, microwaves, wave propagation, communication theory, channel estimation, radar & sonar systems, signal processing, image processing, artificial neural networks, data mining and machine learning, fuzzy logic and systems, fuzzy control, optimal & robust control, navigation & estimation theory, power electronics & drives, and power generation & management. The editors welcome papers from professors and researchers at universities, research centers, organizations, companies, and industries all over the world, in the hope that this will advance the scientific standards of the journal and provide a channel of communication between Iranian scholars and their colleagues in other parts of the world.