Optimizing VR for all users through adaptive focus displays

Nitish Padmanaban, Robert Konrad, Emily A. Cooper, Gordon Wetzstein
DOI: 10.1145/3084363.3085029
Published in: ACM SIGGRAPH 2017 Talks, 2017-07-30
Citations: 4

Abstract

Personal computing devices have evolved steadily, from desktops to mobile devices, and now to emerging trends in wearable computing. Wearables are expected to be integral to consumer electronics, with the primary mode of interaction often being a near-eye display. However, current-generation near-eye displays are unable to provide fully natural focus cues for all users, which often leads to discomfort. This core limitation is due to the optics of the systems themselves, with current displays being unable to change focus as required by natural vision. Furthermore, the form factor often makes it difficult for users to wear corrective eyewear. With two prototype near-eye displays, we address these issues using display modes that adapt to the user via computational optics. These prototypes make use of focus-tunable lenses, mechanically actuated displays, and gaze tracking technology to correct common refractive errors per user, and provide natural focus cues by dynamically updating scene depth based on where a user looks. Recent advances in computational optics hint at a future in which some users experience better vision in the virtual world than in the real one.
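The core idea of adapting focus to the user can be illustrated with a minimal sketch: estimate the fixation distance from the convergence of the two eyes (as reported by a gaze tracker), then drive a focus-tunable lens to the dioptric power that places the virtual image at that distance, offset by the user's spherical prescription. This is an illustrative reconstruction, not the authors' implementation; the function names, the small-angle vergence geometry, and the lens-control interface are all assumptions.

```python
import math

def vergence_distance_m(left_angle_rad, right_angle_rad, ipd_m=0.063):
    """Estimate fixation distance from binocular convergence.

    Angles are each eye's inward rotation from straight ahead
    (positive = converging). Hypothetical gaze-tracker interface.
    """
    vergence = left_angle_rad + right_angle_rad  # total convergence angle
    if vergence <= 0:
        return float('inf')  # eyes parallel or diverging: fixation at infinity
    # Small-angle geometry: distance ≈ interpupillary distance / vergence
    return ipd_m / vergence

def lens_power_diopters(fixation_distance_m, user_prescription_d=0.0):
    """Tunable-lens power (diopters) placing the virtual image at the
    fixation distance, plus the user's spherical refractive correction."""
    accommodation = (0.0 if math.isinf(fixation_distance_m)
                     else 1.0 / fixation_distance_m)
    return accommodation + user_prescription_d

# Example: a user fixating at 0.5 m who has a -2 D (myopic) prescription.
d = vergence_distance_m(0.063, 0.063)          # 0.126 rad total vergence -> 0.5 m
power = lens_power_diopters(d, user_prescription_d=-2.0)
```

In this sketch the per-user prescription term is what removes the need for corrective eyewear inside the headset, while the gaze-dependent accommodation term supplies the dynamic focus cue.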