Enhancing emotion recognition through multi-modal data fusion and graph neural networks

Kasthuri Devarajan, Suresh Ponnan, Sundresan Perumal

Intelligence-based medicine, Volume 12 (2025), Article 100291. DOI: 10.1016/j.ibmed.2025.100291
Full text: https://www.sciencedirect.com/science/article/pii/S266652122500095X
Citations: 0

Abstract

In this paper, a novel emotion detection system is proposed based on a Graph Neural Network (GNN) architecture that integrates and learns from multiple data modalities (EEG, facial expressions, and physiological signals). The proposed GNN learns the interactions between modalities and distills them into a unified representation for emotion categorization. The model achieves 91.25 % accuracy, 91.26 % precision, 91.25 % recall, and a 91.25 % F1-score. Moreover, it strikes a sensible trade-off between speed and accuracy, with a computation time of 163 ms. The proposed GNN performs better primarily because it can represent complex relations among multi-modal inputs, which improves its real-time emotional state recognition and classification performance. It demonstrates its suitability for robust emotion detection by outperforming traditional models such as SVM, KNN, CCA, CNN, and RNN in both classification precision and multi-modal data fusion. Although CNN and RNN achieve slightly better results than the other baselines, the proposed GNN consistently proves to be the most accurate and robust solution.
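The paper itself does not include an implementation; the following is a minimal sketch of the kind of multi-modal GNN fusion the abstract describes, written with PyTorch and PyTorch Geometric. The feature dimensions, the fully connected three-node modality graph, the two GCN layers, and the six emotion classes are all illustrative assumptions rather than details taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of multi-modal fusion
# with a graph neural network. Each modality (EEG, facial expression,
# physiological signals) becomes one node in a small graph, and message
# passing models the cross-modal interactions the abstract refers to.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

# Hypothetical feature and model sizes -- not taken from the paper.
EEG_DIM, FACE_DIM, PHYS_DIM = 128, 64, 32
HIDDEN, NUM_EMOTIONS = 64, 6

class MultiModalGNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Project each modality into a shared embedding space so all three
        # can sit as nodes in one graph.
        self.proj = nn.ModuleList([
            nn.Linear(EEG_DIM, HIDDEN),
            nn.Linear(FACE_DIM, HIDDEN),
            nn.Linear(PHYS_DIM, HIDDEN),
        ])
        # Two graph-convolution layers let each modality node aggregate
        # information from the other two (cross-modal interaction).
        self.gc1 = GCNConv(HIDDEN, HIDDEN)
        self.gc2 = GCNConv(HIDDEN, HIDDEN)
        self.classifier = nn.Linear(HIDDEN, NUM_EMOTIONS)
        # Fully connected graph over the 3 modality nodes.
        src = torch.tensor([0, 0, 1, 1, 2, 2])
        dst = torch.tensor([1, 2, 0, 2, 0, 1])
        self.register_buffer("edge_index", torch.stack([src, dst]))

    def forward(self, eeg, face, phys):
        # One sample at a time for clarity: build the 3 modality nodes.
        x = torch.stack([p(m) for p, m in zip(self.proj, (eeg, face, phys))])
        x = torch.relu(self.gc1(x, self.edge_index))
        x = torch.relu(self.gc2(x, self.edge_index))
        # Mean-pool the modality nodes into one fused representation.
        return self.classifier(x.mean(dim=0))

model = MultiModalGNN()
logits = model(torch.randn(EEG_DIM), torch.randn(FACE_DIM), torch.randn(PHYS_DIM))
print(logits.shape)  # torch.Size([6])
```

In this sketch, the mean-pooled node embeddings play the role of the unified multi-modal representation used for emotion classification; the actual graph construction, pooling, and training details of the proposed model are described in the full paper.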