{"title":"Emotion recognition based on EEG source signals and dynamic brain function network","authors":"He Sun , Hailing Wang , Raofen Wang , Yufei Gao","doi":"10.1016/j.jneumeth.2024.110358","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Brain network features contain more emotion-related information and can be more effective in emotion recognition. However, emotions change continuously and dynamically, and current function brain network features using the sliding window method cannot explore dynamic characteristics of different emotions, which leads to the serious loss of functional connectivity information.</div></div><div><h3>New method</h3><div>In the study, we proposed a new framework based on EEG source signals and dynamic function brain network (dyFBN) for emotion recognition. We constructed emotion-related dyFBN with dynamic phase linearity measurement (dyPLM) at every time point and extracted the second-order feature Root Mean Square (RMS) based on of dyFBN. In addition, a multiple feature fusion strategy was employed, integrating sensor frequency features with connection information.</div></div><div><h3>Results</h3><div>The recognition accuracy of subject-independent and subject-dependent is 83.50 % and 88.93 %, respectively. The selected optimal feature subset of fused features highlighted the interplay between dynamic features and sensor features and showcased the crucial brain regions of the right superiortemporal, left isthmuscingulate, and left parsorbitalis in emotion recognition.</div></div><div><h3>Comparison with existing methods</h3><div>Compared with current methods, the emotion recognition accuracy of subject-independent and subject-dependent is improved by 11.46 % and 10.19 %, respectively. 
In addition, recognition accuracy of the fused features of RMS and sensor features is also better than the fused features of existing methods.</div></div><div><h3>Conclusions</h3><div>These findings prove the validity of the proposed framework, which leads to better emotion recognition.</div></div>","PeriodicalId":16415,"journal":{"name":"Journal of Neuroscience Methods","volume":"415 ","pages":"Article 110358"},"PeriodicalIF":2.7000,"publicationDate":"2024-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Neuroscience Methods","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165027024003030","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Citation count: 0
Abstract
Background
Brain network features carry rich emotion-related information and can be highly effective for emotion recognition. However, emotions change continuously and dynamically, and current functional brain network features built with the sliding-window method cannot capture the dynamic characteristics of different emotions, which leads to serious loss of functional connectivity information.
New method
In this study, we proposed a new framework for emotion recognition based on EEG source signals and a dynamic functional brain network (dyFBN). We constructed an emotion-related dyFBN with dynamic phase linearity measurement (dyPLM) at every time point and extracted the second-order Root Mean Square (RMS) feature from the dyFBN. In addition, a multiple-feature fusion strategy was employed, integrating sensor frequency features with connectivity information.
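The per-timepoint pipeline can be sketched in outline: estimate instantaneous phase from each source signal, form one connectivity matrix per time point, then collapse the resulting dynamic network into a second-order RMS feature per edge. The abstract does not give the dyPLM formula, so this minimal NumPy sketch substitutes a simple cosine-of-phase-difference synchrony as a stand-in connectivity measure; the random signals, region count, and dimensions are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)

# Toy "source" signals: n_regions x n_samples (stand-in for EEG source time courses).
n_regions, n_samples = 4, 512
x = rng.standard_normal((n_regions, n_samples))

# Instantaneous phase of each region via the analytic signal.
phase = np.angle(hilbert(x, axis=1))

# Per-timepoint connectivity: cosine of the phase difference between every
# pair of regions (a simple phase-synchrony proxy standing in for dyPLM).
diff = phase[:, None, :] - phase[None, :, :]   # (n_regions, n_regions, n_samples)
dyfbn = np.cos(diff)                           # one connectivity matrix per time point

# Second-order RMS feature: root-mean-square of each edge's time series,
# collapsing the dynamic network into one value per region pair.
rms = np.sqrt(np.mean(dyfbn ** 2, axis=2))     # (n_regions, n_regions)

# Keep the upper triangle (unique edges) as the feature vector.
iu = np.triu_indices(n_regions, k=1)
features = rms[iu]
print(features.shape)  # (6,)
```

Because connectivity is computed at every sample rather than over sliding windows, no temporal averaging window discards the fast fluctuations the abstract highlights.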
Results
Subject-independent and subject-dependent recognition accuracies are 83.50% and 88.93%, respectively. The optimal subset selected from the fused features highlighted the interplay between dynamic features and sensor features and identified the right superiortemporal, left isthmuscingulate, and left parsorbitalis as crucial brain regions for emotion recognition.
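The fusion-and-selection step can be illustrated with a small sketch: concatenate the connectivity RMS features with sensor-level features, rank the fused columns, and keep a subset. The abstract does not state the selection criterion, so this assumes a Fisher-score ranking; all data, dimensions, and the `fisher_score` helper are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fused features: 60 trials, 20 connectivity-RMS columns + 12 sensor
# band-power columns (placeholders for the paper's actual feature sets).
n_trials = 60
rms_feats = rng.standard_normal((n_trials, 20))
sensor_feats = rng.standard_normal((n_trials, 12))
y = rng.integers(0, 2, n_trials)              # binary emotion labels

# Fusion by column-wise concatenation.
fused = np.hstack([rms_feats, sensor_feats])  # (60, 32)

def fisher_score(X, y):
    """Per-feature Fisher score: between-class over within-class variance."""
    classes = np.unique(y)
    overall = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / den

# Select the 10 highest-scoring fused columns as the "optimal subset".
scores = fisher_score(fused, y)
top10 = np.argsort(scores)[::-1][:10]
subset = fused[:, top10]
print(subset.shape)  # (60, 10)
```

Ranking the fused matrix as a whole, rather than each feature family separately, is what lets the selected subset mix dynamic-network and sensor columns, mirroring the interplay reported above.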
Comparison with existing methods
Compared with current methods, the proposed framework improves subject-independent and subject-dependent emotion recognition accuracy by 11.46% and 10.19%, respectively. In addition, the recognition accuracy of the fused RMS and sensor features also exceeds that of fused features from existing methods.
Conclusions
These findings demonstrate the validity of the proposed framework, which leads to better emotion recognition.
Journal overview:
The Journal of Neuroscience Methods publishes papers that describe new methods that are specifically for neuroscience research conducted in invertebrates, vertebrates or in man. Major methodological improvements or important refinements of established neuroscience methods are also considered for publication. The Journal's Scope includes all aspects of contemporary neuroscience research, including anatomical, behavioural, biochemical, cellular, computational, molecular, invasive and non-invasive imaging, optogenetic, and physiological research investigations.