MEFD dataset and GCSFormer model: Cross-subject emotion recognition based on multimodal physiological signals

IF 1.6 Q3 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING
Xiangyu Deng, Zhecong Fan, Wenbo Dong
{"title":"MEFD dataset and GCSFormer model : Cross-subject emotion recognition based on multimodal physiological signals.","authors":"Xiangyu Deng, Zhecong Fan, Wenbo Dong","doi":"10.1088/2057-1976/ae0e28","DOIUrl":null,"url":null,"abstract":"<p><p>Cross-subject emotion recognition is an important research direction in the fields of affective computing and brain-computer interfaces, aiming to identify the emotional states of different individuals through physiological signals such as functional near-infrared spectroscopy (fNIRS) and electroencephalogram (EEG). Currently, most EEG-based emotion recognition datasets are unimodal or bimodal, which may overlook the emotional information reflected by other physiological signals of the subjects. In this paper, a multimodal dataset named Multimodal Emotion Four Category Dataset (MEFD) is constructed, which includes EEG, Heart Rate Variability (HRV), Electrooculogram (EOG), and Electrodermal Activity (EDA) data from 34 participants in four emotional states: sadness, happiness, fear, and calm. This will contribute to the development of multimodal emotion recognition research. To address the recognition difficulty caused by individual differences in cross-subject emotion recognition tasks, a classification model named Global Convolution Shifted Window Transformer (GCSFormer) composed of an EEG-Swin Convolution module and an improved Global Adaptive Transformer (GAT) module is proposed. By using a parallel network, the feature discrimination ability and generalization ability are enhanced. The model is applied to classify the EEG data in the self-built MEFD dataset, and the results are compared with those of mainstream methods. The experimental results show that the proposed EEG classification method achieves the best average accuracy of 85.36%, precision of 85.23%, recall of 86.35%, and F1 score of 84.52% in the cross-subject emotion recognition task. The excellent performance of GCSFormer in cross-subject emotion recognition task was verifie.</p>","PeriodicalId":8896,"journal":{"name":"Biomedical Physics & Engineering Express","volume":" ","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomedical Physics & Engineering Express","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2057-1976/ae0e28","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING","Score":null,"Total":0}
Citations: 0

Abstract

Cross-subject emotion recognition is an important research direction in affective computing and brain-computer interfaces, aiming to identify the emotional states of different individuals from physiological signals such as functional near-infrared spectroscopy (fNIRS) and electroencephalogram (EEG). Currently, most EEG-based emotion recognition datasets are unimodal or bimodal, which may overlook emotional information reflected in the subjects' other physiological signals. In this paper, a multimodal dataset named the Multimodal Emotion Four Category Dataset (MEFD) is constructed, comprising EEG, Heart Rate Variability (HRV), Electrooculogram (EOG), and Electrodermal Activity (EDA) data from 34 participants in four emotional states: sadness, happiness, fear, and calm. This dataset will contribute to the development of multimodal emotion recognition research. To address the recognition difficulty caused by individual differences in cross-subject emotion recognition tasks, a classification model named the Global Convolution Shifted Window Transformer (GCSFormer) is proposed, composed of an EEG-Swin Convolution module and an improved Global Adaptive Transformer (GAT) module arranged as a parallel network, which enhances feature discrimination and generalization. The model is applied to classify the EEG data in the self-built MEFD dataset, and the results are compared with those of mainstream methods. The experimental results show that the proposed EEG classification method achieves the best average accuracy of 85.36%, precision of 85.23%, recall of 86.35%, and F1 score of 84.52% on the cross-subject emotion recognition task, verifying the excellent performance of GCSFormer.
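As a rough, hypothetical illustration of the parallel-network idea described in the abstract (the authors' GCSFormer implementation is not reproduced here), the sketch below pairs a convolutional branch for local EEG features with a transformer branch for global dependencies and fuses the two for four-class prediction. A plain TransformerEncoder stands in for the model's shifted-window attention and improved GAT module; every class name, the assumed input shape (64 channels × 256 samples), and all hyperparameters are illustrative assumptions, not the paper's architecture.

```python
# Minimal sketch of a parallel two-branch EEG classifier in the spirit of
# GCSFormer. All names, shapes, and hyperparameters are illustrative guesses.
import torch
import torch.nn as nn


class ConvBranch(nn.Module):
    """Convolutional branch: local temporal features from raw EEG."""
    def __init__(self, channels=64, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, dim, kernel_size=7, padding=3),
            nn.BatchNorm1d(dim),
            nn.GELU(),
            nn.Conv1d(dim, dim, kernel_size=7, padding=3),
            nn.AdaptiveAvgPool1d(1),          # -> (B, dim, 1)
        )

    def forward(self, x):                     # x: (B, channels, time)
        return self.net(x).squeeze(-1)        # -> (B, dim)


class TransformerBranch(nn.Module):
    """Transformer branch: global dependencies across time patches.
    A standard encoder substitutes for shifted-window attention here."""
    def __init__(self, channels=64, dim=128, depth=2, heads=4, patch=16):
        super().__init__()
        self.embed = nn.Conv1d(channels, dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):                         # x: (B, channels, time)
        tokens = self.embed(x).transpose(1, 2)    # -> (B, patches, dim)
        return self.encoder(tokens).mean(dim=1)   # -> (B, dim)


class ParallelEEGClassifier(nn.Module):
    """Concatenate both branches and classify the four MEFD emotions."""
    def __init__(self, channels=64, dim=128, n_classes=4):
        super().__init__()
        self.conv = ConvBranch(channels, dim)
        self.trans = TransformerBranch(channels, dim)
        self.head = nn.Linear(2 * dim, n_classes)

    def forward(self, x):
        return self.head(torch.cat([self.conv(x), self.trans(x)], dim=-1))


if __name__ == "__main__":
    model = ParallelEEGClassifier()
    logits = model(torch.randn(8, 64, 256))   # batch of 8 EEG windows
    print(logits.shape)                       # torch.Size([8, 4])
```

Concatenation is the simplest fusion choice; the actual model may weight or gate the two branches differently, and a cross-subject evaluation of this kind would typically use leave-one-subject-out splits over the 34 participants.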

Source journal: Biomedical Physics & Engineering Express (Radiology, Nuclear Medicine & Medical Imaging)
CiteScore: 2.80
Self-citation rate: 0.00%
Articles published: 153
Journal introduction: BPEX is an inclusive, international, multidisciplinary journal devoted to publishing new research on any application of physics and/or engineering in medicine and/or biology. The journal is characterized by broad geographical coverage and a fast-track peer-review process; relevant topics include all aspects of biophysics, medical physics and biomedical engineering. Papers that are almost entirely clinical or biological in their focus are not suitable. The journal has an emphasis on publishing interdisciplinary work and bringing research fields together, encompassing experimental, theoretical and computational work.