Beyond the Game: Multimodal Emotion Recognition Before, During, and After Gameplay
Efstratia Ganiti-Roumeliotou, Ioannis Ziogas, Sofia B. Dias, Ghada Alhussein, Herbert F. Jelinek, Leontios J. Hadjileontiadis
Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2024, pp. 1-7
DOI: 10.1109/EMBC53108.2024.10782547
Published: 2024-07-01
Citations: 0
Abstract
In the era of Human-Computer Interaction (HCI), understanding emotional responses through multimodal signals during interactive experiences, such as serious games (SG), is of high importance. In this work, we explore emotion recognition (ER) by analyzing multimodal data from the 2nd Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems (BIRAFFE-2) dataset, including data from 76 participants engaged in dynamic gameplay and pre-post audiovisual stimulations. Utilizing features derived from electrocardiogram (ECG), electrodermal activity (EDA), accelerometer, gyroscope, game logs (GL), affect dynamics, and personality traits (PT), fed into different machine learning models, our study focuses on ER, achieving state-of-the-art performance across different experimental scenarios (accuracy: 0.967 for Negative Affect in Optimal Game using Support Vector Machines). This highlights the importance of emotional states as indicators for personalized HCI. Our approach offers valuable insights into the interplay between multimodal physiological signals, GL, users' emotional states, and PT, which could inform the design of adaptive, affect-sensitive SG. Distinct patterns in the data are revealed, particularly emphasizing the role of ECG-Derived Respiration features and the impact of past affectivity on the current emotional state. Clinical relevance: By introducing innovative perspectives in affect-sensitive SG design, leveraging the analysis of multimodal signals, we foresee objective digital biomarkers that hold promise to broaden the clinical understanding of patients' emotional behavior during SG-based interventions.
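The classification step described above (multimodal feature vectors fed into an SVM for affect recognition) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the feature dimensions, synthetic data, and binary affect label are hypothetical stand-ins for the paper's actual ECG/EDA/accelerometer/gyroscope/game-log features and BIRAFFE-2 annotations.

```python
# Hedged sketch: an RBF-SVM classifier on stand-in multimodal features.
# All data here is synthetic; real use would substitute per-trial feature
# vectors extracted from ECG, EDA, accelerometer, gyroscope, and game logs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 200, 32                      # hypothetical fused feature size
X = rng.normal(size=(n_trials, n_features))         # stand-in feature matrix
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # stand-in binary affect label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardizing features before an RBF kernel is standard practice, since the
# kernel is sensitive to feature scale.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.3f}")
```

In a per-scenario setup like the paper's (e.g. Negative Affect during gameplay), one such model would typically be trained and evaluated per scenario, with the reported accuracy coming from held-out data.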