Title: Using Context and Physiological Cues to Improve Emotion Recognition in Virtual Reality
Author: Kunal Gupta
Venue: 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
Publication date: 2021-10-01
DOI: 10.1109/ISMAR-Adjunct54149.2021.00105 (https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00105)
Citations: 0
Abstract
Immersive Virtual Reality (VR) can create compelling context-specific emotional experiences, but very few studies explore the importance of emotion-relevant contextual cues in VR. In this thesis, we will investigate how to use combined contextual and physiological cues to improve emotion recognition in VR with a view to enhancing shared VR experiences. We will explore how relatively low-cost sensors can capture physiological information, and how this can be combined with user and environmental context cues to measure emotion more accurately. The main novelty of the research is that it will demonstrate the first Personalized Real-time Emotion-Adaptive Context-Aware VR (PerAffectly VR) system and provide significant insight into how to create, measure, and share emotional VR experiences. The research will be helpful in multiple domains such as tourism, training, entertainment, and gaming. It will also enable the creation of VR interfaces that automatically adapt to the user's emotional needs and provide a better user experience.
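To make the fusion idea concrete, the following is a minimal, hypothetical sketch of how physiological cues (e.g., heart rate and electrodermal activity from low-cost sensors) might be combined with an environmental context cue to estimate emotion. The feature names, normalization ranges, thresholds, and the circumplex-style quadrant mapping are all illustrative assumptions for exposition; this is not the PerAffectly VR system described in the thesis.

```python
# Illustrative sketch: late fusion of physiological cues (arousal) with a
# context cue (valence) to label emotion via a circumplex-style quadrant.
# All ranges and thresholds are assumed values, not from the thesis.

from dataclasses import dataclass


@dataclass
class Sample:
    heart_rate_bpm: float     # physiological cue, e.g., from a PPG sensor
    eda_microsiemens: float   # physiological cue: electrodermal activity
    scene_valence: float      # context cue in [-1, 1], e.g., scene content rating


def estimate_arousal(s: Sample) -> float:
    """Map physiological cues to a normalized arousal score in [0, 1]."""
    # Assumed spans: 60-120 bpm for heart rate, 0-20 uS for EDA.
    hr = min(max((s.heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    eda = min(max(s.eda_microsiemens / 20.0, 0.0), 1.0)
    return 0.5 * hr + 0.5 * eda  # simple equal-weight late fusion


def classify(s: Sample) -> str:
    """Assign a quadrant label from fused arousal and context valence."""
    arousal = estimate_arousal(s)
    if arousal >= 0.5:
        return "excited" if s.scene_valence >= 0 else "stressed"
    return "calm" if s.scene_valence >= 0 else "bored"


if __name__ == "__main__":
    relaxed = Sample(heart_rate_bpm=65, eda_microsiemens=3, scene_valence=0.8)
    tense = Sample(heart_rate_bpm=110, eda_microsiemens=15, scene_valence=-0.5)
    print(classify(relaxed))  # calm
    print(classify(tense))    # stressed
```

In a real system, the hand-tuned thresholds above would be replaced by a trained, per-user model, which is what the "Personalized" aspect of the proposed work implies.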