Emotion recognition in Virtual Reality using sensor fusion with eye tracking
Meral Kuyucu, Mehmet Ali Sarikaya, Tülay Karakaş, Dilek Yıldız Özkan, Yüksel Demir, Ömer Bilen, Gökhan Ince
Computers in Biology and Medicine, Volume 197, Article 111070. Published 2025-09-17. DOI: 10.1016/j.compbiomed.2025.111070
Emotion recognition is an emerging field with applications in healthcare, education, and entertainment. This study integrates Virtual Reality (VR) with multi-sensor fusion to enhance emotion recognition. The research comprises two phases: data collection and analysis/evaluation. Ninety-five participants were exposed to curated audiovisual stimuli designed to elicit a wide range of emotions within an immersive VR environment. VR was chosen for its ability to provide controlled conditions and overcome the limitations of current mobile sensor technologies. Physiological data streams from various sensors were integrated for comprehensive emotional analysis. ElectroEncephaloGraphy (EEG) data revealed brain activity linked to emotional states, while eye tracking data provided insights into gaze direction, pupil dilation, and eye movement, all factors correlated with cognitive and emotional processes. Peripheral signals, including Heart Rate Variability (HRV), ElectroDermal Activity (EDA), and body temperature, were captured via wearable sensors to enrich the dataset. Machine learning models, such as XGBoost, CatBoost, Multilayer Perceptron, Gradient Boosting, and LightGBM, were employed to predict participants' emotional states. Evaluation metrics, including accuracy, precision, recall, and F1 score, demonstrated the robustness and effectiveness of the proposed VR-based multi-sensor fusion approach. This research presents a novel approach to emotion recognition, bridging gaps in traditional methods by integrating VR, multi-sensor fusion, and machine learning.
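The abstract does not spell out the fusion strategy or the exact feature set, so the following is a minimal, hypothetical Python sketch of one common approach, feature-level fusion: per-modality feature blocks (all synthetic here) are concatenated into one vector per stimulus trial and fed to a scikit-learn-style classifier, then scored with the four metrics named above. Array names, feature dimensions, the emotion-class count, and the train/test split are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of feature-level sensor fusion for emotion classification.
# Assumes per-trial feature vectors were already extracted from each modality;
# all shapes and feature choices below are hypothetical, not from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

rng = np.random.default_rng(0)
n_trials = 950  # hypothetical, e.g. 95 participants x 10 stimuli each

# Hypothetical per-modality feature blocks (one row per stimulus trial).
eeg_feats  = rng.normal(size=(n_trials, 32))    # e.g. band powers per channel
eye_feats  = rng.normal(size=(n_trials, 8))     # e.g. pupil diameter, fixations
peri_feats = rng.normal(size=(n_trials, 6))     # e.g. HRV, EDA, temperature
labels     = rng.integers(0, 4, size=n_trials)  # discrete emotion classes

# Feature-level fusion: concatenate modality features into one vector per trial.
X = np.hstack([eeg_feats, eye_feats, peri_feats])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=0)

# Standardize features (matters for an MLP; tree ensembles are scale-invariant).
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Any classifier with the scikit-learn fit/predict interface slots in here.
clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
y_pred = clf.predict(X_test)

# The four evaluation metrics named in the abstract, macro-averaged over classes.
acc = accuracy_score(y_test, y_pred)
prec, rec, f1, _ = precision_recall_fscore_support(
    y_test, y_pred, average="macro", zero_division=0)
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```

The other models the study names can be swapped into the `clf` line unchanged, since XGBoost's `XGBClassifier`, CatBoost's `CatBoostClassifier`, LightGBM's `LGBMClassifier`, and scikit-learn's `MLPClassifier` all follow the same fit/predict convention.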
Journal Introduction:
Computers in Biology and Medicine is an international forum for sharing groundbreaking advancements in the use of computers in bioscience and medicine. This journal serves as a medium for communicating essential research, instruction, ideas, and information regarding the rapidly evolving field of computer applications in these domains. By encouraging the exchange of knowledge, we aim to facilitate progress and innovation in the utilization of computers in biology and medicine.