Relative Impact of Key Rendering Parameters on Perceived Quality of VR Imagery Captured by the Facebook Surround 360 Camera

Nora Pfund, Nitin Sampat, J. Viggiano
{"title":"关键渲染参数对Facebook 360度环绕相机拍摄的VR图像感知质量的相对影响","authors":"Nora Pfund, Nitin Sampat, J. Viggiano","doi":"10.2352/issn.2470-1173.2018.05.pmii-183","DOIUrl":null,"url":null,"abstract":"High quality, 360 capture for Cinematic VR is a relatively new and rapidly evolving technology. The field demands very high quality, distortion- free 360 capture which is not possible with cameras that depend on fish- eye lenses for capturing a 360 field of view. The Facebook Surround 360 Camera, one of the few “players” in this space, is an open-source license design that Facebook has released for anyone that chooses to build it from off-the-shelf components and generate 8K stereo output using open-source licensed rendering software. However, the components are expensive and the system itself is extremely demanding in terms of computer hardware and software. Because of this, there have been very few implementations of this design and virtually no real deployment in the field. We have implemented the system, based on Facebook’s design, and have been testing and deploying it in various situations; even generating short video clips. We have discovered in our recent experience that high quality, 360 capture comes with its own set of new challenges. As an example, even the most fundamental tools of photography like “exposure” become difficult because one is always faced with ultra-high dynamic range scenes (one camera is pointing directly at the sun and the others may be pointing to a dark shadow). The conventional imaging pipeline is further complicated by the fact that the stitching software has different effects on various as- pects of the calibration or pipeline optimization. Most of our focus to date has been on optimizing the imaging pipeline and improving the qual- ity of the output for viewing in an Oculus Rift headset. We designed a controlled experiment to study 5 key parameters in the rendering pipeline– black level, neutral balance, color correction matrix (CCM), geometric calibration and vignetting. By varying all of these parameters in a combinatorial manner, we were able to assess the relative impact of these parameters on the perceived image quality of the output. Our results thus far indicate that the output image quality is greatly influenced by the black level of the individual cameras (the Facebook cam- era comprised of 17 cameras whose output need to be stitched to obtain a 360 view). Neutral balance is least sensitive. We are most confused about the results we obtain from accurately calculating and applying the CCM for each individual camera. We obtained improved results by using the average of the matrices for all cameras. Future work includes evaluating the effects of geometric calibration and vignetting on quality.","PeriodicalId":309050,"journal":{"name":"Photography, Mobile, and Immersive Imaging","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Relative Impact of Key Rendering Parameters on Perceived Quality of VR Imagery Captured by the Facebook Surround 360 Camera\",\"authors\":\"Nora Pfund, Nitin Sampat, J. Viggiano\",\"doi\":\"10.2352/issn.2470-1173.2018.05.pmii-183\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"High quality, 360 capture for Cinematic VR is a relatively new and rapidly evolving technology. 
The field demands very high quality, distortion- free 360 capture which is not possible with cameras that depend on fish- eye lenses for capturing a 360 field of view. The Facebook Surround 360 Camera, one of the few “players” in this space, is an open-source license design that Facebook has released for anyone that chooses to build it from off-the-shelf components and generate 8K stereo output using open-source licensed rendering software. However, the components are expensive and the system itself is extremely demanding in terms of computer hardware and software. Because of this, there have been very few implementations of this design and virtually no real deployment in the field. We have implemented the system, based on Facebook’s design, and have been testing and deploying it in various situations; even generating short video clips. We have discovered in our recent experience that high quality, 360 capture comes with its own set of new challenges. As an example, even the most fundamental tools of photography like “exposure” become difficult because one is always faced with ultra-high dynamic range scenes (one camera is pointing directly at the sun and the others may be pointing to a dark shadow). The conventional imaging pipeline is further complicated by the fact that the stitching software has different effects on various as- pects of the calibration or pipeline optimization. Most of our focus to date has been on optimizing the imaging pipeline and improving the qual- ity of the output for viewing in an Oculus Rift headset. We designed a controlled experiment to study 5 key parameters in the rendering pipeline– black level, neutral balance, color correction matrix (CCM), geometric calibration and vignetting. By varying all of these parameters in a combinatorial manner, we were able to assess the relative impact of these parameters on the perceived image quality of the output. Our results thus far indicate that the output image quality is greatly influenced by the black level of the individual cameras (the Facebook cam- era comprised of 17 cameras whose output need to be stitched to obtain a 360 view). Neutral balance is least sensitive. We are most confused about the results we obtain from accurately calculating and applying the CCM for each individual camera. We obtained improved results by using the average of the matrices for all cameras. 
Future work includes evaluating the effects of geometric calibration and vignetting on quality.\",\"PeriodicalId\":309050,\"journal\":{\"name\":\"Photography, Mobile, and Immersive Imaging\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-01-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Photography, Mobile, and Immersive Imaging\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2352/issn.2470-1173.2018.05.pmii-183\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Photography, Mobile, and Immersive Imaging","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2352/issn.2470-1173.2018.05.pmii-183","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

High-quality 360° capture for cinematic VR is a relatively new and rapidly evolving technology. The field demands very high-quality, distortion-free 360° capture, which is not possible with cameras that depend on fisheye lenses to cover a 360° field of view. The Facebook Surround 360 Camera, one of the few "players" in this space, is an open-source licensed design that Facebook has released for anyone who chooses to build it from off-the-shelf components and generate 8K stereo output using open-source licensed rendering software. However, the components are expensive, and the system itself is extremely demanding in terms of computer hardware and software. Because of this, there have been very few implementations of this design and virtually no real deployment in the field.

We have implemented the system based on Facebook's design and have been testing and deploying it in various situations, even generating short video clips. Our recent experience shows that high-quality 360° capture comes with its own set of new challenges. For example, even the most fundamental tools of photography, such as exposure, become difficult because one is always faced with ultra-high-dynamic-range scenes: one camera may point directly at the sun while others point into dark shadow. The conventional imaging pipeline is further complicated by the fact that the stitching software responds differently to the various aspects of calibration and pipeline optimization. Most of our focus to date has been on optimizing the imaging pipeline and improving the quality of the output for viewing in an Oculus Rift headset.

We designed a controlled experiment to study five key parameters in the rendering pipeline: black level, neutral balance, color correction matrix (CCM), geometric calibration, and vignetting. By varying these parameters in a combinatorial manner, we were able to assess their relative impact on the perceived image quality of the output. Our results thus far indicate that output image quality is strongly influenced by the black level of the individual cameras (the Facebook camera comprises 17 cameras whose outputs must be stitched to obtain a 360° view). Neutral balance is the least sensitive parameter. The most puzzling results come from accurately calculating and applying the CCM for each individual camera; we obtained better results by applying the average of the matrices across all cameras. Future work includes evaluating the effects of geometric calibration and vignetting on quality.
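As a minimal sketch of the correction chain the abstract describes, the following Python fragment applies per-camera black-level subtraction and neutral (white) balance, then a single CCM averaged over all 17 cameras rather than each camera's own matrix. This is not the authors' released pipeline; the calibration values, frame data, and function names here are placeholders for illustration only.

```python
# Hypothetical sketch (not the authors' released code): black-level subtraction,
# neutral balance, and colour correction with an averaged CCM, as outlined in
# the abstract. All calibration numbers and frames below are stand-ins.
import numpy as np

def correct_frame(raw, black_level, wb_gains, ccm):
    """Apply black level, neutral balance, and a 3x3 CCM to an RGB frame in [0, 1]."""
    img = raw.astype(np.float64) - black_level   # per-channel black-level subtraction
    img = np.clip(img, 0.0, None)
    img *= wb_gains                              # neutral (white) balance gains
    img = img @ ccm.T                            # 3x3 colour correction matrix
    return np.clip(img, 0.0, 1.0)

# Placeholder calibration for the 17 cameras: each has its own black level,
# white-balance gains, and an individually measured CCM.
rng = np.random.default_rng(0)
n_cameras = 17
black_levels = [np.full(3, 0.02) for _ in range(n_cameras)]
wb_gains = [np.array([1.9, 1.0, 1.6]) for _ in range(n_cameras)]
ccms = [np.eye(3) + 0.05 * rng.standard_normal((3, 3)) for _ in range(n_cameras)]

# The abstract reports better perceived quality when the per-camera CCMs are
# replaced by their average, so every camera shares one matrix:
avg_ccm = np.mean(ccms, axis=0)

frames = [rng.random((4, 4, 3)) for _ in range(n_cameras)]   # stand-in raw frames
corrected = [correct_frame(f, bl, g, avg_ccm)
             for f, bl, g in zip(frames, black_levels, wb_gains)]
```

The same structure extends to the combinatorial study: each of the five parameters can be toggled between its per-camera calibrated value and a baseline, and the resulting renders compared for perceived quality.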