{"title":"RayGauss: Volumetric Gaussian-Based Ray Casting for Photorealistic Novel View Synthesis","authors":"Hugo Blanc, Jean-Emmanuel Deschaud, Alexis Paljic","doi":"arxiv-2408.03356","DOIUrl":null,"url":null,"abstract":"Differentiable volumetric rendering-based methods made significant progress\nin novel view synthesis. On one hand, innovative methods have replaced the\nNeural Radiance Fields (NeRF) network with locally parameterized structures,\nenabling high-quality renderings in a reasonable time. On the other hand,\napproaches have used differentiable splatting instead of NeRF's ray casting to\noptimize radiance fields rapidly using Gaussian kernels, allowing for fine\nadaptation to the scene. However, differentiable ray casting of irregularly\nspaced kernels has been scarcely explored, while splatting, despite enabling\nfast rendering times, is susceptible to clearly visible artifacts. Our work closes this gap by providing a physically consistent formulation of\nthe emitted radiance c and density {\\sigma}, decomposed with Gaussian functions\nassociated with Spherical Gaussians/Harmonics for all-frequency colorimetric\nrepresentation. We also introduce a method enabling differentiable ray casting\nof irregularly distributed Gaussians using an algorithm that integrates\nradiance fields slab by slab and leverages a BVH structure. This allows our\napproach to finely adapt to the scene while avoiding splatting artifacts. As a\nresult, we achieve superior rendering quality compared to the state-of-the-art\nwhile maintaining reasonable training times and achieving inference speeds of\n25 FPS on the Blender dataset. Project page with videos and code:\nhttps://raygauss.github.io/","PeriodicalId":501174,"journal":{"name":"arXiv - CS - Graphics","volume":"41 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.03356","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Differentiable volumetric rendering-based methods have made significant progress in novel view synthesis. On one hand, innovative methods have replaced the Neural Radiance Fields (NeRF) network with locally parameterized structures, enabling high-quality renderings in a reasonable time. On the other hand, approaches have used differentiable splatting instead of NeRF's ray casting to optimize radiance fields rapidly using Gaussian kernels, allowing for fine adaptation to the scene. However, differentiable ray casting of irregularly spaced kernels has been scarcely explored, while splatting, despite enabling fast rendering times, is susceptible to clearly visible artifacts.

Our work closes this gap by providing a physically consistent formulation of the emitted radiance c and density σ, decomposed with Gaussian functions associated with Spherical Gaussians/Harmonics for an all-frequency colorimetric representation. We also introduce a method enabling differentiable ray casting of irregularly distributed Gaussians, using an algorithm that integrates radiance fields slab by slab and leverages a BVH structure. This allows our approach to adapt finely to the scene while avoiding splatting artifacts. As a result, we achieve superior rendering quality compared to the state of the art while maintaining reasonable training times and achieving inference speeds of 25 FPS on the Blender dataset. Project page with videos and code:
https://raygauss.github.io/
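
The abstract only names the decomposition of the emitted radiance c and density σ over Gaussian functions; the exact equations are not given here. The following is a hedged sketch of what such a Gaussian-mixture radiance field typically looks like, where the kernels G_i, density weights σ_i, and view-dependent colors c_i(d) (evaluated from Spherical Harmonics/Gaussians coefficients) are assumed notation, not the paper's exact formulation.

```latex
% Hedged sketch (assumed notation, not the paper's exact equations):
% density and emitted radiance modeled as a mixture of N irregularly
% placed 3D Gaussian kernels G_i with means mu_i and covariances Sigma_i;
% sigma_i is a per-kernel density weight and c_i(d) a view-dependent color.
\[
  G_i(\mathbf{x}) = \exp\!\left(-\tfrac{1}{2}\,(\mathbf{x}-\boldsymbol{\mu}_i)^{\top}
                    \boldsymbol{\Sigma}_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)\right),
  \qquad
  \sigma(\mathbf{x}) = \sum_{i=1}^{N} \sigma_i \, G_i(\mathbf{x}),
\]
\[
  c(\mathbf{x}, \mathbf{d}) =
  \frac{\sum_{i=1}^{N} c_i(\mathbf{d}) \, \sigma_i \, G_i(\mathbf{x})}
       {\sum_{i=1}^{N} \sigma_i \, G_i(\mathbf{x})}.
\]
```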
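Likewise, the slab-by-slab differentiable ray casting is only named in the abstract. Below is a minimal Python sketch of front-to-back emission-absorption integration over depth slabs for a Gaussian mixture of the form above. The paper accelerates the per-slab Gaussian lookup with a BVH; this sketch substitutes a brute-force distance test, and all names (integrate_ray, slab_size, influence_radius, ...) are illustrative assumptions, not the paper's API.

```python
# Hedged sketch of slab-by-slab volume integration along one ray, assuming the
# Gaussian-mixture density/radiance sketched above. colors holds per-Gaussian
# RGB values; the paper instead evaluates them from SH/SG coefficients and the
# view direction. The brute-force "active" query stands in for a BVH traversal.
import numpy as np

def integrate_ray(origin, direction, means, inv_covs, sigmas, colors,
                  t_near=0.0, t_far=4.0, slab_size=0.05, samples_per_slab=4,
                  influence_radius=0.3):
    """Front-to-back emission-absorption integration of a ray through a
    mixture of 3D Gaussians, processed one depth slab at a time."""
    radiance = np.zeros(3)
    transmittance = 1.0
    t = t_near
    while t < t_far and transmittance > 1e-4:
        t_end = min(t + slab_size, t_far)
        # Stand-in for the BVH query: which Gaussians can affect this slab?
        mid = origin + 0.5 * (t + t_end) * direction
        dists = np.linalg.norm(means - mid, axis=1)
        active = np.where(dists < influence_radius + 0.5 * slab_size)[0]
        if active.size > 0:
            # Midpoint quadrature samples inside the slab.
            dt = (t_end - t) / samples_per_slab
            ts = t + (np.arange(samples_per_slab) + 0.5) * dt
            for tk in ts:
                x = origin + tk * direction
                diff = x - means[active]                       # (M, 3)
                # Unnormalized Gaussian responses G_i(x).
                mahal = np.einsum('mi,mij,mj->m', diff, inv_covs[active], diff)
                g = np.exp(-0.5 * mahal)
                w = sigmas[active] * g                         # sigma_i * G_i(x)
                density = w.sum()
                if density <= 0.0:
                    continue
                color = (w[:, None] * colors[active]).sum(0) / density
                alpha = 1.0 - np.exp(-density * dt)
                radiance += transmittance * alpha * color
                transmittance *= 1.0 - alpha
        t = t_end
    return radiance
```

Processing slabs front to back lets the loop stop early once transmittance is negligible; in the actual method, a BVH over the Gaussians' bounding volumes would replace the brute-force per-slab query so the cost does not scale with the total number of primitives.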