EEG emotion recognition based on dynamic temporal causal graph convolutional network

IF 7.6 · CAS Tier 1 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
Yaru Zhou, Xueying Zhang, Ying Sun, Guijun Chen, Lixia Huang, Haifeng Li
{"title":"EEG emotion recognition based on dynamic temporal causal graph convolutional network","authors":"Yaru Zhou ,&nbsp;Xueying Zhang ,&nbsp;Ying Sun ,&nbsp;Guijun Chen ,&nbsp;Lixia Huang ,&nbsp;Haifeng Li","doi":"10.1016/j.knosys.2025.113752","DOIUrl":null,"url":null,"abstract":"<div><div>Inspired by the connectivity characteristics of brain networks, the dynamic evolution in the connectivity relationships between different brain regions throughout time provides information on emotional representation. However, current electroencephalographic (EEG) emotion recognition methods often overlook local density, global sparsity, and temporal causality that are inherent in the connectivity relationships between brain regions. To address these issues, a dynamic temporal–causal graph convolutional network (DTC-GCN) for EEG emotion recognition is proposed herein. The DTC-GCN learns the spatial topology and temporal–causal relationships between EEG channels over a period of time. It takes time-series graphs as the input and is implemented in two stages. In the first stage, the sparse connected dynamic graph convolutional network is used to dynamically learn a locally dense and global sparse brain network. In the second stage, the temporal–causal module is used to construct causal connectivity among EEG channels across different segments. The effectiveness of the proposed model is evaluated by conducting extensive experiments on two publicly available datasets, DEAP and SEED. On the DEAP dataset, the average accuracies of arousal and valence are 95.08% and 94.31%, respectively. On the SEED dataset, the average accuracy is 98.48%. Results indicate that the DTC-GCN outperforms existing state-of-the-art methods. By analyzing the parameters of the DTC-GCN and conducting an interpretability study, we reveal the overall connectivity pattern between EEG channels and the causal relationships between segments within short time intervals.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"323 ","pages":"Article 113752"},"PeriodicalIF":7.6000,"publicationDate":"2025-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125007981","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Inspired by the connectivity characteristics of brain networks, the dynamic evolution of the connectivity between different brain regions over time carries information about emotional representation. However, current electroencephalographic (EEG) emotion recognition methods often overlook the local density, global sparsity, and temporal causality inherent in the connectivity between brain regions. To address these issues, a dynamic temporal-causal graph convolutional network (DTC-GCN) for EEG emotion recognition is proposed herein. The DTC-GCN learns the spatial topology and temporal-causal relationships between EEG channels over a period of time. It takes time-series graphs as input and is implemented in two stages. In the first stage, a sparsely connected dynamic graph convolutional network dynamically learns a locally dense, globally sparse brain network. In the second stage, a temporal-causal module constructs causal connectivity among EEG channels across different segments. The effectiveness of the proposed model is evaluated through extensive experiments on two publicly available datasets, DEAP and SEED. On the DEAP dataset, the average accuracies for arousal and valence classification are 95.08% and 94.31%, respectively. On the SEED dataset, the average accuracy is 98.48%. These results indicate that the DTC-GCN outperforms existing state-of-the-art methods. By analyzing the parameters of the DTC-GCN and conducting an interpretability study, we reveal the overall connectivity pattern between EEG channels and the causal relationships between segments within short time intervals.
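The abstract gives only a high-level outline of the two stages, so the PyTorch sketch below is a rough, hypothetical illustration rather than the authors' implementation: the module names (SparseDynamicGCN, TemporalCausalModule), the top-k sparsification rule, the causal attention mask, and all layer sizes are assumptions chosen to match the description of a learned locally dense, globally sparse adjacency followed by causal relations across segments.

```python
# Hypothetical sketch of a two-stage model in the spirit of the abstract.
# Stage 1 learns a sparsified dynamic adjacency over EEG channels and applies
# graph convolution; stage 2 relates segments with causally masked attention.
# All names, sizes, and the top-k rule are assumptions, not the paper's design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseDynamicGCN(nn.Module):
    """Stage 1 (assumed): score pairwise channel connectivity from learnable
    embeddings, keep only each channel's k strongest links (locally dense,
    globally sparse), then apply one graph-convolution step."""

    def __init__(self, n_channels: int, in_dim: int, out_dim: int, k: int = 4):
        super().__init__()
        self.k = k
        self.emb = nn.Parameter(torch.randn(n_channels, in_dim))
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, segments, channels, in_dim) per-segment channel features
        scores = F.relu(self.emb @ self.emb.T)            # (C, C) connectivity
        topk = torch.topk(scores, self.k, dim=-1)
        mask = torch.zeros_like(scores).scatter_(-1, topk.indices, 1.0)
        adj = F.softmax(scores.masked_fill(mask == 0, float("-inf")), dim=-1)
        # One GCN step per segment: aggregate top-k neighbors, then project.
        return F.relu(self.proj(adj @ x))


class TemporalCausalModule(nn.Module):
    """Stage 2 (assumed): self-attention across segments with a causal mask,
    so each segment attends only to earlier segments."""

    def __init__(self, dim: int, n_heads: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, segments, dim) pooled per-segment representations
        t = x.size(1)
        causal = torch.triu(torch.ones(t, t, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        out, _ = self.attn(x, x, x, attn_mask=causal)
        return out


class DTCGCNSketch(nn.Module):
    def __init__(self, n_channels=32, in_dim=5, hid=32, n_classes=2):
        super().__init__()
        self.gcn = SparseDynamicGCN(n_channels, in_dim, hid)
        self.temporal = TemporalCausalModule(hid)
        self.head = nn.Linear(hid, n_classes)

    def forward(self, x):
        h = self.gcn(x)              # (B, T, C, hid)
        h = h.mean(dim=2)            # pool channels -> (B, T, hid)
        h = self.temporal(h)         # causal mixing across segments
        return self.head(h[:, -1])   # classify from the last segment


# Example shapes: 32 channels, 5 band-power features, 6 segments, batch of 8.
model = DTCGCNSketch()
logits = model(torch.randn(8, 6, 32, 5))
```

Keeping only each channel's k strongest links makes every row of the adjacency locally dense while the matrix as a whole stays sparse, and the strictly upper-triangular attention mask ensures each segment draws only on earlier segments, which is one simple way to realize temporal causality.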
Journal

Knowledge-Based Systems (Engineering & Technology · Computer Science: Artificial Intelligence)
CiteScore: 14.80
Self-citation rate: 12.50%
Annual articles: 1245
Review time: 7.8 months
About the journal: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems based on knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, provide balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.