An evaluation of pupillary light response models for 2D screens and VR HMDs

Brendan David-John, Pallavi Raiturkar, Arunava Banerjee, Eakta Jain
DOI: 10.1145/3281505.3281538
Published in: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology
Publication date: 2018-11-28
Citations: 18

Abstract

Pupil diameter changes have been shown to be indicative of user engagement and cognitive load for various tasks and environments. However, it is still not the preferred physiological measure for applied settings. This reluctance to leverage the pupil as an index of user engagement stems from the problem that in scenarios where scene brightness cannot be controlled, the pupil light response confounds the cognitive-emotional response. What if we could predict the light response of an individual's pupil, thus creating the opportunity to factor it out of the measurement? In this work, we lay the groundwork for this research by evaluating three models of pupillary light response in 2D, and in a virtual reality (VR) environment. Our results show that either a linear or an exponential model can be fit to an individual participant with an easy-to-use calibration procedure. This work opens several new research directions in VR relating to performance analysis and inspires the use of eye tracking beyond gaze as a pointer and foveated rendering.
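The calibration idea described above — fitting a per-participant linear or exponential model of pupil diameter as a function of display brightness — can be sketched in a few lines. This is a minimal illustration, not the paper's actual procedure: the calibration data below is hypothetical, and the specific model forms (a linear fit against log luminance, and a simple exponential fit via grid search) are assumptions for demonstration.

```python
import numpy as np

# Hypothetical calibration data for one participant: screen luminance
# (cd/m^2) and measured pupil diameter (mm). Values are illustrative only.
luminance = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 200.0])
diameter = np.array([7.2, 6.1, 5.6, 4.3, 3.8, 3.3])

# Linear model: d = a * log10(L) + b, fit by ordinary least squares.
# (Pupil size is commonly modeled against log luminance.)
x = np.log10(luminance)
A = np.vstack([x, np.ones_like(x)]).T
(a_lin, b_lin), *_ = np.linalg.lstsq(A, diameter, rcond=None)
sse_lin = np.sum((A @ np.array([a_lin, b_lin]) - diameter) ** 2)

# Exponential model: d = a * exp(b * log10(L)) + c, fit by grid-searching
# the rate b and solving for a, c by least squares at each candidate.
best = None
for b in np.linspace(-2.0, -0.01, 200):
    basis = np.vstack([np.exp(b * x), np.ones_like(x)]).T
    coef, *_ = np.linalg.lstsq(basis, diameter, rcond=None)
    sse = np.sum((basis @ coef - diameter) ** 2)
    if best is None or sse < best[0]:
        best = (sse, b, coef)
sse_exp, b_exp, (a_exp, c_exp) = best

print(f"linear SSE: {sse_lin:.4f}, exponential SSE: {sse_exp:.4f}")
```

Once such a model is calibrated, its predicted light-driven diameter could be subtracted from the measured signal, leaving a residual that more closely reflects cognitive-emotional load — the factoring-out step the abstract motivates.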