Cross-subject emotion recognition in brain-computer interface based on frequency band attention graph convolutional adversarial neural networks

Shinan Chen, Yuchen Wang, Xuefen Lin, Xiaoyong Sun, Weihua Li, Weifeng Ma
Journal of Neuroscience Methods, Volume 411, Article 110276 (published 2024-09-03). DOI: 10.1016/j.jneumeth.2024.110276
IF 2.7 | CAS Medicine Zone 4 | JCR Q2 (Biochemical Research Methods)
Citations: 0

Abstract

Background:

Emotion is an important research area in neuroscience. Cross-subject emotion recognition based on electroencephalogram (EEG) data is challenging due to physiological differences between subjects. The domain gap, i.e., the difference in EEG data distributions across subjects, has therefore attracted considerable attention in cross-subject emotion recognition.

Comparison with existing methods:

This study focuses on narrowing the domain gap between subjects by exploiting emotional frequency bands and the relationship information between EEG channels. Emotional frequency band features describe the energy distribution of the EEG signal across frequency ranges, while the relationship information between EEG channels captures its spatial distribution.
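The abstract does not spell out the feature pipeline, but frequency band features of this kind are commonly computed as band-wise energy or differential entropy per EEG channel. The sketch below is a minimal illustration under that assumption; the band boundaries, sampling rate, and names (`BANDS`, `band_de_features`) are illustrative choices, not the authors' code.

```python
# Illustrative sketch only: band-wise differential entropy (DE) features
# per EEG channel. Band boundaries and DE are common choices in
# SEED/DEAP-style pipelines, assumed here rather than taken from the paper.
import numpy as np
from scipy.signal import butter, sosfiltfilt

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def band_de_features(eeg, fs=200.0):
    """eeg: (n_channels, n_samples) array -> (n_channels, n_bands) DE features."""
    feats = np.empty((eeg.shape[0], len(BANDS)))
    for j, (lo, hi) in enumerate(BANDS.values()):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, eeg, axis=-1)
        # DE of a Gaussian signal: 0.5 * ln(2 * pi * e * variance)
        var = filtered.var(axis=-1) + 1e-12
        feats[:, j] = 0.5 * np.log(2 * np.pi * np.e * var)
    return feats
```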

New method:

To achieve this, this paper proposes a model called the Frequency Band Attention Graph convolutional Adversarial neural Network (FBAGAN). The model comprises three components: a feature extractor, a classifier, and a discriminator. The feature extractor consists of a frequency band attention layer followed by a graph convolutional neural network: the attention mechanism extracts frequency band information by assigning a weight to each band, while the graph convolutional network extracts relationship information between EEG channels by modeling their graph structure. The discriminator then helps minimize the gap in frequency and relationship information between the source and target domains, improving the model's ability to generalize.
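The abstract only names the three components. The following is a minimal PyTorch-style sketch of how such a feature extractor (band attention followed by a graph convolution over channels), emotion classifier, and domain discriminator could be wired together; the layer sizes, the learnable adjacency, the 62-channel assumption (as in SEED), and the gradient-reversal trick are assumptions about a typical adversarial domain-adaptation setup, not the published implementation.

```python
# Sketch of a band-attention + GCN adversarial model; shapes and the
# gradient-reversal layer are assumptions, not the authors' released code.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass, reversed (scaled) gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class FeatureExtractor(nn.Module):
    """Band attention over frequency-band features, then one graph convolution over channels."""
    def __init__(self, n_channels=62, n_bands=5, hidden=64):
        super().__init__()
        self.band_attn = nn.Parameter(torch.ones(n_bands))     # one weight per band
        self.adj = nn.Parameter(torch.eye(n_channels))         # learnable channel adjacency
        self.gcn = nn.Linear(n_bands, hidden)
        self.readout = nn.Linear(n_channels * hidden, hidden)

    def forward(self, x):                                      # x: (batch, channels, bands)
        x = x * torch.softmax(self.band_attn, dim=0)           # weight each frequency band
        a = torch.softmax(self.adj, dim=-1)                    # row-normalized adjacency
        h = torch.relu(self.gcn(a @ x))                        # aggregate neighbors, project
        return torch.relu(self.readout(h.flatten(1)))


class FBAGANSketch(nn.Module):
    """Feature extractor + emotion classifier + domain discriminator."""
    def __init__(self, n_classes=3, hidden=64):
        super().__init__()
        self.extractor = FeatureExtractor(hidden=hidden)
        self.classifier = nn.Linear(hidden, n_classes)
        self.discriminator = nn.Linear(hidden, 2)              # source vs. target domain

    def forward(self, x, lam=1.0):
        h = self.extractor(x)
        return self.classifier(h), self.discriminator(GradReverse.apply(h, lam))
```

In such a setup, the classifier is trained on labeled source-subject data while the gradient-reversal layer turns the discriminator's domain loss into an adversarial signal for the extractor; this is one standard way the "discriminator helps minimize the gap" described above could be realized, offered here as a sketch rather than the paper's training procedure.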

Results:

The FBAGAN model is extensively tested on the SEED, SEED-IV, and DEAP datasets. It reaches a mean accuracy of 88.17% (standard deviation 4.88) on SEED and 77.35% (standard deviation 3.72) on SEED-IV. On DEAP, it achieves 69.64% for Arousal and 65.18% for Valence. These results outperform most existing models.
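A mean accuracy with a standard deviation of this kind is usually obtained by aggregating per-subject results, typically from a leave-one-subject-out protocol; both that protocol and the numbers in the sketch below are assumptions used purely to illustrate how such summary scores are computed.

```python
# Hypothetical aggregation of per-subject accuracies (e.g., from a
# leave-one-subject-out split); values below are placeholders, not the
# paper's per-subject results.
import numpy as np

def summarize_cross_subject(per_subject_acc):
    acc = np.asarray(per_subject_acc, dtype=float) * 100.0
    return acc.mean(), acc.std()          # mean accuracy (%) and its std

mean_acc, std_acc = summarize_cross_subject([0.91, 0.85, 0.88])
print(f"accuracy = {mean_acc:.2f}%, std = {std_acc:.2f}")
```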

Conclusions:

The experiments indicate that FBAGAN effectively addresses the challenges of domain transfer in both the EEG channel (spatial) domain and the frequency band domain, leading to improved cross-subject performance.

Source journal
Journal of Neuroscience Methods (Medicine – Neuroscience)
CiteScore: 7.10 | Self-citation rate: 3.30% | Publication volume: 226 | Review time: 52 days
Aims and scope: The Journal of Neuroscience Methods publishes papers that describe new methods developed specifically for neuroscience research in invertebrates, vertebrates, or humans. Major methodological improvements or important refinements of established neuroscience methods are also considered for publication. The Journal's scope includes all aspects of contemporary neuroscience research, including anatomical, behavioural, biochemical, cellular, computational, molecular, invasive and non-invasive imaging, optogenetic, and physiological research investigations.