Estimation of Change in Affective State Using Eye Tracking Features from Virtual Reality Technologies
M. Pszeida, Amir Dini, M. Schneeberger, M. Lenger, L. Paletta, S. Russegger, S. Reidl, S. Beranek, Sandra Schuessler, A. Haeussl, R. Hartmann, Martin Sighart, Sebastian Mayer, Patricia Papic, Beatrix Koch, Hermine Fürli
{"title":"Estimation of Change in Affective State Using Eye Tracking Features from Virtual Reality Technologies","authors":"M. Pszeida, Amir Dini, M. Schneeberger, M. Lenger, L. Paletta, S. Russegger, S. Reidl, S. Beranek, B. ., Msc ., Sandra Schuessler, A. Haeussl, B. ., R. Hartmann, Martin Sighart, Sebastian Mayer, Patricia Papic, Beatrix Koch, Hermine Fürli","doi":"10.54941/ahfe1001843","DOIUrl":null,"url":null,"abstract":"Affective states play a prominent role in the context of human activation and motivation. Immersive VR-based presence provides opportunities to activate elderly people in the context of preferred leisure activities (Häussl et al., 2021) or to apply mindfulness interventions for their cognitive reserve (Paletta et al., 2021). The appropriate design of positively activating content is pivotal for appropriate changes in users’ affective states. The presented study provided insight into the potential of non-invasive VR-based eye tracking for automated estimation of affective state induced by video content, in an explorative pilot study with seven elderly persons living in a nursing home. The results indicate the feasibility of estimating mood change from typical eye movement features, such as, fixation duration and pupil diameter, as a promising future research topic.","PeriodicalId":285612,"journal":{"name":"Cognitive Computing and Internet of Things","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Computing and Internet of Things","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54941/ahfe1001843","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Affective states play a prominent role in human activation and motivation. Immersive VR-based presence provides opportunities to activate elderly people through preferred leisure activities (Häussl et al., 2021) or to apply mindfulness interventions that support their cognitive reserve (Paletta et al., 2021). Carefully designed, positively activating content is pivotal for eliciting the desired changes in users’ affective states. The present study provides insight into the potential of non-invasive VR-based eye tracking for automated estimation of affective states induced by video content, based on an explorative pilot study with seven elderly persons living in a nursing home. The results indicate that estimating mood change from typical eye movement features, such as fixation duration and pupil diameter, is feasible and a promising topic for future research.
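To make the estimation idea concrete, the following is a minimal sketch of how mood-change estimation from aggregated eye-movement features such as fixation duration and pupil diameter could be framed as a small supervised-learning problem. The feature set, the synthetic data, and the choice of logistic regression with leave-one-out cross-validation are illustrative assumptions and do not reproduce the study's actual pipeline.

```python
# Minimal illustrative sketch (not the authors' method): predict a binary
# mood-change label from summary eye-tracking features of a video segment.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def summarize_segment(fixation_durations_ms, pupil_diameters_mm):
    """Collapse raw gaze samples of one video segment into summary features."""
    return np.array([
        np.mean(fixation_durations_ms),   # average fixation duration
        np.std(fixation_durations_ms),    # variability of fixation durations
        np.mean(pupil_diameters_mm),      # average pupil diameter
    ])

# Hypothetical data: one row per viewed segment, label 1 = self-reported
# mood improved, 0 = unchanged or worsened (values are random placeholders).
rng = np.random.default_rng(0)
X = np.vstack([
    summarize_segment(rng.normal(260, 40, 50), rng.normal(3.4, 0.2, 50))
    for _ in range(28)
])
y = rng.integers(0, 2, size=28)

# Leave-one-out cross-validation is a common choice for very small samples,
# such as a pilot study with only seven participants.
model = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy: {scores.mean():.2f}")
```

With real data, the random placeholders would be replaced by fixation and pupil-diameter samples from the eye tracker and by self-reported mood ratings collected before and after each video.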