Joint low-rank tensor fusion and cross-modal attention for multimodal physiological signals based emotion recognition.

Impact Factor 2.3 · CAS Tier 4 (Medicine) · JCR Q3 (Biophysics)
Xin Wan, Yongxiong Wang, Zhe Wang, Yiheng Tang, Benke Liu
{"title":"Joint low-rank tensor fusion and cross-modal attention for multimodal physiological signals based emotion recognition.","authors":"Xin Wan, Yongxiong Wang, Zhe Wang, Yiheng Tang, Benke Liu","doi":"10.1088/1361-6579/ad5bbc","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objective</i>. Physiological signals based emotion recognition is a prominent research domain in the field of human-computer interaction. Previous studies predominantly focused on unimodal data, giving limited attention to the interplay among multiple modalities. Within the scope of multimodal emotion recognition, integrating the information from diverse modalities and leveraging the complementary information are the two essential issues to obtain the robust representations.<i>Approach</i>. Thus, we propose a intermediate fusion strategy for combining low-rank tensor fusion with the cross-modal attention to enhance the fusion of electroencephalogram, electrooculogram, electromyography, and galvanic skin response. Firstly, handcrafted features from distinct modalities are individually fed to corresponding feature extractors to obtain latent features. Subsequently, low-rank tensor is fused to integrate the information by the modality interaction representation. Finally, a cross-modal attention module is employed to explore the potential relationships between the distinct latent features and modality interaction representation, and recalibrate the weights of different modalities. And the resultant representation is adopted for emotion recognition.<i>Main results</i>. Furthermore, to validate the effectiveness of the proposed method, we execute subject-independent experiments within the DEAP dataset. The proposed method has achieved the accuracies of 73.82% and 74.55% for valence and arousal classification.<i>Significance</i>. The results of extensive experiments verify the outstanding performance of the proposed method.</p>","PeriodicalId":20047,"journal":{"name":"Physiological measurement","volume":null,"pages":null},"PeriodicalIF":2.3000,"publicationDate":"2024-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Physiological measurement","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1088/1361-6579/ad5bbc","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"BIOPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Objective. Physiological signal-based emotion recognition is a prominent research domain in human-computer interaction. Previous studies predominantly focused on unimodal data, giving limited attention to the interplay among multiple modalities. Within multimodal emotion recognition, integrating information from diverse modalities and leveraging their complementary information are the two essential issues in obtaining robust representations. Approach. We therefore propose an intermediate fusion strategy that combines low-rank tensor fusion with cross-modal attention to enhance the fusion of electroencephalogram (EEG), electrooculogram (EOG), electromyography (EMG), and galvanic skin response (GSR) signals. First, handcrafted features from the distinct modalities are individually fed to the corresponding feature extractors to obtain latent features. Subsequently, low-rank tensor fusion integrates these latent features into a modality interaction representation. Finally, a cross-modal attention module explores the relationships between the distinct latent features and the modality interaction representation and recalibrates the weights of the different modalities; the resulting representation is used for emotion recognition. Main results. To validate the effectiveness of the proposed method, we conduct subject-independent experiments on the DEAP dataset. The proposed method achieves accuracies of 73.82% and 74.55% for valence and arousal classification, respectively. Significance. The results of extensive experiments verify the outstanding performance of the proposed method.
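The pipeline described above, low-rank tensor fusion of per-modality latent features followed by a cross-modal attention step, can be illustrated with a short PyTorch sketch. This is a minimal, hypothetical implementation: the module names, feature dimensions, rank, and the use of nn.MultiheadAttention are assumptions made for illustration and do not reproduce the authors' exact architecture.

```python
# Minimal sketch of the fusion pipeline described in the abstract: low-rank
# tensor fusion of per-modality latent features into a modality interaction
# representation, followed by cross-modal attention that recalibrates modality
# contributions. All dimensions, the rank, and the attention module are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class LowRankTensorFusion(nn.Module):
    """Approximate the outer-product fusion tensor with per-modality low-rank factors."""

    def __init__(self, input_dims, rank, output_dim):
        super().__init__()
        self.rank = rank
        # One factor per modality; the +1 appends a constant 1 to each feature
        # vector so unimodal and lower-order interaction terms are preserved.
        self.factors = nn.ParameterList(
            [nn.Parameter(torch.randn(d + 1, rank, output_dim) * 0.1) for d in input_dims]
        )
        self.fusion_weights = nn.Parameter(torch.randn(rank) * 0.1)
        self.fusion_bias = nn.Parameter(torch.zeros(output_dim))

    def forward(self, feats):
        # feats: list of (batch, d_m) latent feature tensors, one per modality.
        batch = feats[0].shape[0]
        fused = None
        for z, w in zip(feats, self.factors):
            ones = torch.ones(batch, 1, device=z.device, dtype=z.dtype)
            z_aug = torch.cat([z, ones], dim=1)              # (batch, d_m + 1)
            proj = torch.einsum('bd,dro->bro', z_aug, w)     # (batch, rank, output_dim)
            fused = proj if fused is None else fused * proj  # product across modalities
        # Collapse the rank dimension with learned weights.
        return torch.einsum('r,bro->bo', self.fusion_weights, fused) + self.fusion_bias


class CrossModalAttention(nn.Module):
    """Let the interaction representation attend over per-modality latent features."""

    def __init__(self, dim, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, interaction, modality_feats):
        # interaction: (batch, dim); modality_feats: (batch, n_modalities, dim).
        query = interaction.unsqueeze(1)                     # (batch, 1, dim)
        attended, weights = self.attn(query, modality_feats, modality_feats)
        return attended.squeeze(1), weights                  # recalibrated representation


# Hypothetical shapes: EEG/EOG/EMG/GSR latent features of different sizes.
eeg, eog, emg, gsr = (torch.randn(8, d) for d in (160, 36, 32, 24))
fusion = LowRankTensorFusion(input_dims=[160, 36, 32, 24], rank=4, output_dim=64)
interaction = fusion([eeg, eog, emg, gsr])                   # (8, 64)

# Project each modality to the shared dimension before attention (assumption).
projs = nn.ModuleList([nn.Linear(d, 64) for d in (160, 36, 32, 24)])
modality_feats = torch.stack([p(z) for p, z in zip(projs, (eeg, eog, emg, gsr))], dim=1)
attention = CrossModalAttention(dim=64)
representation, attn_weights = attention(interaction, modality_feats)  # (8, 64)
```

In practice a small classifier head on the attended representation would produce the valence and arousal predictions, and the whole pipeline would be trained end-to-end on the extracted handcrafted features.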

Source journal
Physiological Measurement (Biology – Engineering: Biomedical)
CiteScore: 5.50
Self-citation rate: 9.40%
Articles per year: 124
Review time: 3 months
Journal description: Physiological Measurement publishes papers about the quantitative assessment and visualization of physiological function in clinical research and practice, with an emphasis on the development of new methods of measurement and their validation. Papers are published on topics including:
- applied physiology in illness and health
- electrical bioimpedance, optical and acoustic measurement techniques
- advanced methods of time series and other data analysis
- biomedical and clinical engineering
- in-patient and ambulatory monitoring
- point-of-care technologies
- novel clinical measurements of cardiovascular, neurological, and musculoskeletal systems
- measurements in molecular, cellular and organ physiology and electrophysiology
- physiological modeling and simulation
- novel biomedical sensors, instruments, devices and systems
- measurement standards and guidelines