Bidirectional projective sampling for physics-based differentiable rendering
Ruicheng Gao, Yue Qi
Virtual Reality Intelligent Hardware, Vol. 7, No. 4, pp. 367–378, August 2025
DOI: 10.1016/j.vrih.2025.05.001
https://www.sciencedirect.com/science/article/pii/S2096579625000324
Citations: 0
Abstract
Background
Physics-based differentiable rendering (PBDR) aims to propagate gradients from scene parameters to image pixels or vice versa. The resulting physically correct gradients can be used in various applications, including inverse rendering and machine learning. Currently, two categories of methods are prevalent in the PBDR community: reparameterization and boundary sampling. State-of-the-art boundary sampling methods rely on a guiding structure to calculate the gradients efficiently: they take the rays generated by traditional path tracing and project them onto the object's silhouette boundary to initialize that structure.
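To make the projection step concrete, here is a minimal 2D sketch (not the paper's implementation, which operates on 3D scenes): each surface point hit by an ordinary path-tracing ray is projected onto its nearest silhouette edge, and the resulting edge index and parametric position can then seed a guiding structure over the boundary. The function name and segment representation are illustrative assumptions.

```python
import numpy as np

def project_to_silhouette(points, edges):
    """Project each surface point onto its nearest silhouette edge.

    points: (N, 2) surface hit points from ordinary path-tracing rays.
    edges:  list of (a, b) segment endpoints approximating the object's
            silhouette boundary (a toy 2D stand-in for 3D silhouette edges).
    Returns per-point edge indices and parametric positions t in [0, 1],
    which together can seed a guiding structure over the boundary.
    """
    points = np.asarray(points, dtype=float)
    idx = np.zeros(len(points), dtype=int)
    ts = np.zeros(len(points))
    for i, p in enumerate(points):
        best = np.inf
        for j, (a, b) in enumerate(edges):
            a, b = np.asarray(a, float), np.asarray(b, float)
            d = b - a
            # Clamp the orthogonal projection to the segment.
            t = np.clip(np.dot(p - a, d) / np.dot(d, d), 0.0, 1.0)
            dist = np.linalg.norm(p - (a + t * d))
            if dist < best:
                best, idx[i], ts[i] = dist, j, t
    return idx, ts

# Example: two boundary edges and three hit points.
edges = [(np.array([0.0, 0.0]), np.array([1.0, 0.0])),
         (np.array([1.0, 0.0]), np.array([1.0, 1.0]))]
pts = [(0.5, 0.2), (1.2, 0.5), (0.0, 0.1)]
idx, ts = project_to_silhouette(pts, edges)
```

In the actual method, each projected sample records where along the silhouette boundary the discontinuity lies, and the density of such samples guides subsequent boundary sampling.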
Methods
In this study, we propose a bidirectional augmentation of previous projective-sampling-based boundary sampling methods. Specifically, we initialize the guiding structure using not only the rays spawned from the sensors but also the rays emitted by the emitters.
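The intuition behind the bidirectional initialization can be sketched with a toy example (the seed values and histogram-based guiding structure below are illustrative assumptions, not the paper's actual data structure): sensor-spawned rays tend to seed the part of the boundary visible to the camera, while emitter-spawned rays seed the part facing the light, so merging both sets covers more of the silhouette boundary.

```python
import numpy as np

def boundary_histogram(ts, bins=8):
    """Bin parametric boundary positions t in [0, 1] into a toy guiding histogram."""
    h, _ = np.histogram(np.asarray(ts), bins=bins, range=(0.0, 1.0))
    return h

# Hypothetical projected seeds: camera subpaths land on one portion of the
# boundary, emitter subpaths on another.
camera_ts = [0.05, 0.10, 0.20, 0.30]   # seeds from sensor-spawned rays
emitter_ts = [0.70, 0.80, 0.90, 0.95]  # seeds from emitter-spawned rays

h_cam = boundary_histogram(camera_ts)
h_bi = boundary_histogram(camera_ts + emitter_ts)

# Bidirectional seeding populates bins the unidirectional pass missed,
# giving the guiding structure wider coverage of the silhouette boundary.
coverage_cam = int(np.count_nonzero(h_cam))
coverage_bi = int(np.count_nonzero(h_bi))
```

Wider coverage of the boundary by the guiding structure is what ultimately reduces the variance of the boundary-integral gradient estimates reported in the results.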
Results
To demonstrate the benefits of our technique, we perform a comparative analysis of differentiable rendering and inverse rendering performance. We utilize a range of synthetic scene examples and evaluate our method against state-of-the-art projective-sampling-based differentiable rendering methods.
Conclusions
The experiments show that our method achieves lower-variance gradients in the forward differentiable rendering process and better geometry reconstruction quality in the inverse-rendering results.