BiLSTM-Based Human Emotion Classification Using EEG Signal.

IF 1.7
Akhilesh Kumar, Awadhesh Kumar
{"title":"BiLSTM-Based Human Emotion Classification Using EEG Signal.","authors":"Akhilesh Kumar, Awadhesh Kumar","doi":"10.1177/15500594251364017","DOIUrl":null,"url":null,"abstract":"<p><p>Emotion recognition using electroencephalography (EEG) signals has garnered significant attention due to its applications in affective computing, human-computer interaction, and healthcare. This study employs a Bidirectional Long Short-Term Memory (BiLSTM) network to classify emotions using EEG data from four well-established datasets: SEED, SEED-IV, SEED-V, and DEAP. By leveraging the temporal dependencies inherent in EEG signals, the BiLSTM model demonstrates robust learning of emotional states. The model achieved notable classification accuracies, with 92.30% for SEED, 99.98% for SEED-IV, 99.97% for SEED-V, and 88.33% for DEAP, showcasing its effectiveness across datasets with varying class distributions. The superior performance on SEED-IV and SEED-V underscores the BiLSTM's capability to capture bidirectional temporal information, which is crucial for emotion recognition tasks. Moreover, this work highlights the importance of utilizing diverse datasets to validate the generalizability of EEG-based emotion recognition models. The integration of both dimensional and discrete emotion models in the study demonstrates the framework's flexibility in addressing various emotion representation paradigms. Future directions include optimizing the framework for real-world applications, such as wearable EEG devices, and exploring transfer learning techniques to enhance cross-subject and cross-cultural adaptability. Overall, this study advances EEG-based emotion recognition methodologies, establishing a robust foundation for integrating affective computing into various domains and paving the way for real-time, reliable emotion recognition systems.</p>","PeriodicalId":93940,"journal":{"name":"Clinical EEG and neuroscience","volume":" ","pages":"15500594251364017"},"PeriodicalIF":1.7000,"publicationDate":"2025-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Clinical EEG and neuroscience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/15500594251364017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Emotion recognition using electroencephalography (EEG) signals has garnered significant attention due to its applications in affective computing, human-computer interaction, and healthcare. This study employs a Bidirectional Long Short-Term Memory (BiLSTM) network to classify emotions using EEG data from four well-established datasets: SEED, SEED-IV, SEED-V, and DEAP. By leveraging the temporal dependencies inherent in EEG signals, the BiLSTM model demonstrates robust learning of emotional states. The model achieved notable classification accuracies, with 92.30% for SEED, 99.98% for SEED-IV, 99.97% for SEED-V, and 88.33% for DEAP, showcasing its effectiveness across datasets with varying class distributions. The superior performance on SEED-IV and SEED-V underscores the BiLSTM's capability to capture bidirectional temporal information, which is crucial for emotion recognition tasks. Moreover, this work highlights the importance of utilizing diverse datasets to validate the generalizability of EEG-based emotion recognition models. The integration of both dimensional and discrete emotion models in the study demonstrates the framework's flexibility in addressing various emotion representation paradigms. Future directions include optimizing the framework for real-world applications, such as wearable EEG devices, and exploring transfer learning techniques to enhance cross-subject and cross-cultural adaptability. Overall, this study advances EEG-based emotion recognition methodologies, establishing a robust foundation for integrating affective computing into various domains and paving the way for real-time, reliable emotion recognition systems.
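The paper itself provides no code; the following is a minimal sketch of the kind of BiLSTM classifier the abstract describes, written in PyTorch. The feature dimension (62 channels), hidden size, number of layers, number of emotion classes, and the use of the final time step for classification are illustrative assumptions, not details reported in the paper.

# Minimal sketch of a BiLSTM emotion classifier for EEG feature sequences.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn


class BiLSTMEmotionClassifier(nn.Module):
    def __init__(self, n_features: int = 62, hidden_size: int = 128,
                 n_layers: int = 2, n_classes: int = 3):
        super().__init__()
        # bidirectional=True reads each EEG sequence in both temporal
        # directions, the property the abstract credits for the gains.
        self.lstm = nn.LSTM(
            input_size=n_features,
            hidden_size=hidden_size,
            num_layers=n_layers,
            batch_first=True,
            bidirectional=True,
        )
        # Forward and backward states are concatenated, hence 2 * hidden_size.
        self.classifier = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features), e.g. per-channel EEG features
        # computed over sliding windows (an assumption, not the paper's pipeline).
        out, _ = self.lstm(x)
        # Classify from the representation at the last time step.
        return self.classifier(out[:, -1, :])


if __name__ == "__main__":
    # Toy forward pass: 8 trials, 200 time steps, 62 features per step.
    model = BiLSTMEmotionClassifier(n_features=62, n_classes=3)
    dummy = torch.randn(8, 200, 62)
    logits = model(dummy)  # (8, 3) unnormalized class scores
    print(logits.shape)

A real experiment would pair this module with a cross-entropy loss and per-dataset preprocessing (e.g., SEED's three-class labels versus DEAP's valence/arousal dimensions), which the abstract mentions but does not specify.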
