Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition.

Impact Factor 2.4 · CAS Region 3 (Medicine) · JCR Q3 (Neurosciences)
Frontiers in Human Neuroscience · Pub Date: 2024-12-17 · eCollection Date: 2024-01-01 · DOI: 10.3389/fnhum.2024.1471634
Wei Lu, Xiaobo Zhang, Lingnan Xia, Hua Ma, Tien-Ping Tan
Journal: Frontiers in Human Neuroscience, vol. 18, article 1471634 · Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11685119/pdf/ · Citations: 0

Abstract

Emotion recognition is a critical research topic within affective computing, with potential applications across various domains. Currently, EEG-based emotion recognition, utilizing deep learning frameworks, has been effectively applied and achieved commendable performance. However, existing deep learning-based models face challenges in capturing both the spatial activity features and spatial topology features of EEG signals simultaneously. To address this challenge, a domain-adaptation spatial-feature perception-network has been proposed for cross-subject EEG emotion recognition tasks, named DSP-EmotionNet. Firstly, a spatial activity topological feature extractor module has been designed to capture spatial activity features and spatial topology features of EEG signals, named SATFEM. Then, using SATFEM as the feature extractor, DSP-EmotionNet has been designed, significantly improving the accuracy of the model in cross-subject EEG emotion recognition tasks. The proposed model surpasses state-of-the-art methods in cross-subject EEG emotion recognition tasks, achieving an average recognition accuracy of 82.5% on the SEED dataset and 65.9% on the SEED-IV dataset.
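The abstract describes SATFEM as jointly capturing spatial activity features (per-electrode signal characteristics) and spatial topology features (relations between neighbouring electrodes). The paper's actual architecture is not given here; the following is only a minimal illustrative sketch of that idea, using channel power as a stand-in for activity features and a simple graph-neighbour aggregation as a stand-in for topology features. All function names and the toy electrode layout are hypothetical.

```python
import numpy as np

def spatial_activity_features(eeg):
    # Mean signal power per channel: a crude stand-in for the
    # "spatial activity" features the abstract refers to.
    return np.mean(eeg ** 2, axis=-1)  # shape: (channels,)

def spatial_topology_features(activity, adjacency):
    # Average each channel's activity over its electrode neighbours,
    # a minimal graph-style encoding of spatial topology.
    deg = adjacency.sum(axis=1, keepdims=True)
    return (adjacency @ activity[:, None] / np.maximum(deg, 1)).ravel()

def satfem_like(eeg, adjacency):
    # Concatenate both feature types, echoing the idea of capturing
    # activity and topology information simultaneously.
    act = spatial_activity_features(eeg)
    topo = spatial_topology_features(act, adjacency)
    return np.concatenate([act, topo])

# Toy example: 4 electrodes in a ring, 1 s of data at 200 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 200))
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
features = satfem_like(eeg, adj)
print(features.shape)  # (8,): 4 activity + 4 topology values
```

In the actual model these features would feed a classifier trained with a domain-adaptation objective so that feature distributions align across subjects; that training loop is beyond the scope of this sketch.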

Source Journal
Frontiers in Human Neuroscience · Medicine – Neuroscience
CiteScore: 4.70
Self-citation rate: 6.90%
Articles published: 830
Review time: 2-4 weeks
About the journal: Frontiers in Human Neuroscience is a first-tier electronic journal devoted to understanding the brain mechanisms supporting cognitive and social behavior in humans, and how these mechanisms might be altered in disease states. The last 25 years have seen an explosive growth in both the methods and the theoretical constructs available to study the human brain. Advances in electrophysiological, neuroimaging, neuropsychological, psychophysical, neuropharmacological and computational approaches have provided key insights into the mechanisms of a broad range of human behaviors in both health and disease. Work in human neuroscience ranges from the cognitive domain, including areas such as memory, attention, language and perception, to the social domain, with this last subject addressing topics such as interpersonal interactions, social discourse and emotional regulation. How these processes unfold during development, mature in adulthood and often decline in aging, and how they are altered in a host of developmental, neurological and psychiatric disorders, has become increasingly amenable to human neuroscience research approaches. Work in human neuroscience has influenced many areas of inquiry ranging from social and cognitive psychology to economics, law and public policy. Accordingly, the journal provides a forum for human research spanning all areas of human cognitive, social, developmental and translational neuroscience using any research approach.