Reprojection-Free VR Passthrough

Grace Kuo, Eric Penner, Seth Moczydlowski, Alexander Ching, Douglas Lanman, N. Matsuda
{"title":"无重投影VR直通","authors":"Grace Kuo, Eric Penner, Seth Moczydlowski, Alexander Ching, Douglas Lanman, N. Matsuda","doi":"10.1145/3588037.3595391","DOIUrl":null,"url":null,"abstract":"Virtual reality (VR) passthrough uses external cameras on the front of a headset to allow the user to see their environment. However, passthrough cameras cannot physically be co-located with the user’s eyes, so the passthrough images have a different perspective than what the user would see without the headset. Although the images can be computationally reprojected into the desired view, errors in depth estimation and missing information at occlusion boundaries can lead to undesirable artifacts. We propose a novel computational camera that directly samples the rays that would have gone into the user’s eye, several centimeters behind the sensor. Our design contains an array of lenses with an aperture behind each lens, and the apertures are strategically placed to allow through only the desired rays. The resulting thin, flat architecture has suitable form factor for VR, and the image reconstruction is computationally lightweight, enabling low-latency passthrough. We demonstrate our approach experimentally in a fully functional binocular passthrough prototype with practical calibration and real-time image reconstruction.","PeriodicalId":348151,"journal":{"name":"ACM SIGGRAPH 2023 Emerging Technologies","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Reprojection-Free VR Passthrough\",\"authors\":\"Grace Kuo, Eric Penner, Seth Moczydlowski, Alexander Ching, Douglas Lanman, N. Matsuda\",\"doi\":\"10.1145/3588037.3595391\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Virtual reality (VR) passthrough uses external cameras on the front of a headset to allow the user to see their environment. However, passthrough cameras cannot physically be co-located with the user’s eyes, so the passthrough images have a different perspective than what the user would see without the headset. Although the images can be computationally reprojected into the desired view, errors in depth estimation and missing information at occlusion boundaries can lead to undesirable artifacts. We propose a novel computational camera that directly samples the rays that would have gone into the user’s eye, several centimeters behind the sensor. Our design contains an array of lenses with an aperture behind each lens, and the apertures are strategically placed to allow through only the desired rays. The resulting thin, flat architecture has suitable form factor for VR, and the image reconstruction is computationally lightweight, enabling low-latency passthrough. 
We demonstrate our approach experimentally in a fully functional binocular passthrough prototype with practical calibration and real-time image reconstruction.\",\"PeriodicalId\":348151,\"journal\":{\"name\":\"ACM SIGGRAPH 2023 Emerging Technologies\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-07-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM SIGGRAPH 2023 Emerging Technologies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3588037.3595391\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM SIGGRAPH 2023 Emerging Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3588037.3595391","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Virtual reality (VR) passthrough uses external cameras on the front of a headset to allow the user to see their environment. However, passthrough cameras cannot physically be co-located with the user’s eyes, so the passthrough images have a different perspective than what the user would see without the headset. Although the images can be computationally reprojected into the desired view, errors in depth estimation and missing information at occlusion boundaries can lead to undesirable artifacts. We propose a novel computational camera that directly samples the rays that would have gone into the user’s eye, several centimeters behind the sensor. Our design contains an array of lenses with an aperture behind each lens, and the apertures are strategically placed to allow through only the desired rays. The resulting thin, flat architecture has suitable form factor for VR, and the image reconstruction is computationally lightweight, enabling low-latency passthrough. We demonstrate our approach experimentally in a fully functional binocular passthrough prototype with practical calibration and real-time image reconstruction.
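The aperture placement described in the abstract follows simple projective geometry: each aperture should sit on the line joining its lenslet to the virtual eye point located behind the sensor, so that only rays headed toward that point are admitted. The sketch below illustrates this idea under simplified pinhole-style assumptions; the function name, distances, and on-axis eye position are illustrative and are not taken from the paper's actual prototype.

```python
import numpy as np

# Minimal geometric sketch (not the authors' actual optical design):
# place each aperture on the line joining its lenslet center to a virtual
# eye point a few centimeters behind the lens plane, so only rays headed
# toward that point pass through. Distances are in millimeters; all
# parameter values below are assumed for illustration.

def aperture_centers(lenslet_x, d_aperture, d_eye, eye_x=0.0):
    """Lateral aperture-center positions for lenslets at lateral
    positions `lenslet_x` (lens plane at depth 0), with the aperture
    plane at depth d_aperture and the virtual eye point at depth d_eye.
    By similar triangles, the admitted chief ray from a lenslet center
    (x, 0) to the eye point (eye_x, d_eye) crosses the aperture plane at
    x + (eye_x - x) * d_aperture / d_eye."""
    lenslet_x = np.asarray(lenslet_x, dtype=float)
    return lenslet_x + (eye_x - lenslet_x) * (d_aperture / d_eye)

if __name__ == "__main__":
    lenslet_pitch = 1.0                      # 1 mm lenslet spacing (assumed)
    xs = np.arange(-10, 11) * lenslet_pitch  # a 21-lenslet row
    # Apertures 2 mm behind the lenses, eye point 50 mm behind the lens plane.
    centers = aperture_centers(xs, d_aperture=2.0, d_eye=50.0)
    # Each aperture shifts slightly toward the optical axis, and the shift
    # grows linearly with the lenslet's distance from the axis.
    print(np.round(centers - xs, 3))
```

In this toy model the aperture offset grows linearly with lenslet position, which is consistent with the abstract's description of apertures "strategically placed to allow through only the desired rays"; the real design additionally involves the lens focal lengths and sensor placement, which this sketch ignores.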