ST-SCGNN: A Spatio-Temporal Self-Constructing Graph Neural Network for Cross-Subject EEG-Based Emotion Recognition and Consciousness Detection

IF 6.7 | CAS Region 2 (Medicine) | JCR Q1 (Computer Science, Information Systems)
Jiahui Pan;Rongming Liang;Zhipeng He;Jingcong Li;Yan Liang;Xinjie Zhou;Yanbin He;Yuanqing Li
{"title":"ST-SCGNN: A Spatio-Temporal Self-Constructing Graph Neural Network for Cross-Subject EEG-Based Emotion Recognition and Consciousness Detection","authors":"Jiahui Pan;Rongming Liang;Zhipeng He;Jingcong Li;Yan Liang;Xinjie Zhou;Yanbin He;Yuanqing Li","doi":"10.1109/JBHI.2023.3335854","DOIUrl":null,"url":null,"abstract":"In this paper, a novel spatio-temporal self-constructing graph neural network (ST-SCGNN) is proposed for cross-subject emotion recognition and consciousness detection. For spatio-temporal feature generation, activation and connection pattern features are first extracted and then combined to leverage their complementary emotion-related information. Next, a self-constructing graph neural network with a spatio-temporal model is presented. Specifically, the graph structure of the neural network is dynamically updated by the self-constructing module of the input signal. Experiments based on the SEED and SEED-IV datasets showed that the model achieved average accuracies of 85.90% and 76.37%, respectively. Both values exceed the state-of-the-art metrics with the same protocol. In clinical besides, patients with disorders of consciousness (DOC) suffer severe brain injuries, and sufficient training data for EEG-based emotion recognition cannot be collected. Our proposed ST-SCGNN method for cross-subject emotion recognition was first attempted in training in ten healthy subjects and testing in eight patients with DOC. We found that two patients obtained accuracies significantly higher than chance level and showed similar neural patterns with healthy subjects. Covert consciousness and emotion-related abilities were thus demonstrated in these two patients. 
Our proposed ST-SCGNN for cross-subject emotion recognition could be a promising tool for consciousness detection in DOC patients.","PeriodicalId":13073,"journal":{"name":"IEEE Journal of Biomedical and Health Informatics","volume":"28 2","pages":"777-788"},"PeriodicalIF":6.7000,"publicationDate":"2023-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Biomedical and Health Informatics","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10329957/","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

In this paper, a novel spatio-temporal self-constructing graph neural network (ST-SCGNN) is proposed for cross-subject emotion recognition and consciousness detection. For spatio-temporal feature generation, activation and connection pattern features are first extracted and then combined to leverage their complementary emotion-related information. Next, a self-constructing graph neural network with a spatio-temporal model is presented. Specifically, the graph structure of the network is dynamically updated by a self-constructing module driven by the input signal. Experiments on the SEED and SEED-IV datasets showed that the model achieved average accuracies of 85.90% and 76.37%, respectively, both exceeding state-of-the-art results under the same protocol. Moreover, in clinical settings, patients with disorders of consciousness (DOC) have suffered severe brain injuries, and sufficient training data for EEG-based emotion recognition cannot be collected from them. The proposed cross-subject ST-SCGNN method was therefore first evaluated by training on ten healthy subjects and testing on eight patients with DOC. Two patients obtained accuracies significantly higher than chance level and showed neural patterns similar to those of healthy subjects, demonstrating covert consciousness and emotion-related abilities in these two patients. The proposed ST-SCGNN for cross-subject emotion recognition could thus be a promising tool for consciousness detection in DOC patients.
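The abstract describes two key ingredients: per-channel activation features combined with inter-channel connection-pattern features, and a "self-constructing" module that builds the graph structure dynamically from the input rather than using a fixed electrode adjacency. The sketch below illustrates that general idea in numpy only; the feature choices (differential entropy for activation, Pearson correlation for connectivity), the cosine-similarity top-k graph construction, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def activation_features(eeg):
    """Per-channel activation feature: differential entropy of each channel,
    DE = 0.5 * log(2*pi*e*var). DE is a common EEG emotion feature; the
    paper's exact activation feature may differ."""
    var = eeg.var(axis=1)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def connection_features(eeg):
    """Connection-pattern feature: Pearson correlation between channels;
    each channel's feature vector is its row of the correlation matrix."""
    return np.corrcoef(eeg)

def self_constructing_adjacency(node_feats, top_k=5):
    """Build the graph dynamically from the input: cosine similarity
    between node feature vectors, keep each node's top-k neighbors,
    symmetrize, then row-normalize."""
    unit = node_feats / (np.linalg.norm(node_feats, axis=1, keepdims=True) + 1e-8)
    sim = unit @ unit.T
    np.fill_diagonal(sim, 0.0)            # no self-loops in the similarity
    adj = np.zeros_like(sim)
    for i in range(sim.shape[0]):
        nbrs = np.argsort(sim[i])[-top_k:]  # indices of the k most similar nodes
        adj[i, nbrs] = sim[i, nbrs]
    adj = np.maximum(adj, adj.T)          # symmetrize the sparse graph
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    return adj / deg                      # row-normalized adjacency

rng = np.random.default_rng(0)
eeg = rng.standard_normal((62, 400))      # 62 channels x 400 samples (SEED uses 62 electrodes)
act = activation_features(eeg)            # shape (62,)
conn = connection_features(eeg)           # shape (62, 62)
node_feats = np.concatenate([act[:, None], conn], axis=1)  # combined per-node features
A = self_constructing_adjacency(node_feats, top_k=5)       # input-dependent graph, (62, 62)
```

Because `A` is recomputed from each input segment's features, different emotional states yield different graph topologies, which is the essence of the self-constructing idea as opposed to a fixed, anatomy-based electrode graph.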
Source journal: IEEE Journal of Biomedical and Health Informatics (Computer Science, Information Systems; Computer Science, Interdisciplinary Applications)
CiteScore: 13.60
Self-citation rate: 6.50%
Annual articles: 1151
Journal description: IEEE Journal of Biomedical and Health Informatics publishes original papers presenting recent advances where information and communication technologies intersect with health, healthcare, life sciences, and biomedicine. Topics include acquisition, transmission, storage, retrieval, management, and analysis of biomedical and health information. The journal covers applications of information technologies in healthcare, patient monitoring, preventive care, early disease diagnosis, therapy discovery, and personalized treatment protocols. It explores electronic medical and health records, clinical information systems, decision support systems, medical and biological imaging informatics, wearable systems, body area/sensor networks, and more. Integration-related topics like interoperability, evidence-based medicine, and secure patient data are also addressed.