{"title":"BRDF-NeRF: Neural Radiance Fields with Optical Satellite Images and BRDF Modelling","authors":"Lulin Zhang, Ewelina Rupnik, Tri Dung Nguyen, Stéphane Jacquemoud, Yann Klinger","doi":"arxiv-2409.12014","DOIUrl":null,"url":null,"abstract":"Understanding the anisotropic reflectance of complex Earth surfaces from\nsatellite imagery is crucial for numerous applications. Neural radiance fields\n(NeRF) have become popular as a machine learning technique capable of deducing\nthe bidirectional reflectance distribution function (BRDF) of a scene from\nmultiple images. However, prior research has largely concentrated on applying\nNeRF to close-range imagery, estimating basic Microfacet BRDF models, which\nfall short for many Earth surfaces. Moreover, high-quality NeRFs generally\nrequire several images captured simultaneously, a rare occurrence in satellite\nimaging. To address these limitations, we propose BRDF-NeRF, developed to\nexplicitly estimate the Rahman-Pinty-Verstraete (RPV) model, a semi-empirical\nBRDF model commonly employed in remote sensing. We assess our approach using\ntwo datasets: (1) Djibouti, captured in a single epoch at varying viewing\nangles with a fixed Sun position, and (2) Lanzhou, captured over multiple\nepochs with different viewing angles and Sun positions. Our results, based on\nonly three to four satellite images for training, demonstrate that BRDF-NeRF\ncan effectively synthesize novel views from directions far removed from the\ntraining data and produce high-quality digital surface models (DSMs).","PeriodicalId":501130,"journal":{"name":"arXiv - CS - Computer Vision and Pattern Recognition","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Computer Vision and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.12014","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Understanding the anisotropic reflectance of complex Earth surfaces from satellite imagery is crucial for numerous applications. Neural radiance fields (NeRF) have become popular as a machine learning technique capable of deducing the bidirectional reflectance distribution function (BRDF) of a scene from multiple images. However, prior research has largely concentrated on applying NeRF to close-range imagery, estimating basic microfacet BRDF models, which fall short for many Earth surfaces. Moreover, high-quality NeRFs generally require several images captured simultaneously, a rare occurrence in satellite imaging. To address these limitations, we propose BRDF-NeRF, developed to explicitly estimate the Rahman-Pinty-Verstraete (RPV) model, a semi-empirical BRDF model commonly employed in remote sensing. We assess our approach using two datasets: (1) Djibouti, captured in a single epoch at varying viewing angles with a fixed Sun position, and (2) Lanzhou, captured over multiple epochs with different viewing angles and Sun positions. Our results, based on only three to four satellite images for training, demonstrate that BRDF-NeRF can effectively synthesize novel views from directions far removed from the training data and produce high-quality digital surface models (DSMs).
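For readers unfamiliar with the RPV model named in the abstract, the sketch below gives a minimal NumPy implementation of its standard semi-empirical form from the remote sensing literature (amplitude rho_0, Minnaert-like shape parameter k, Henyey-Greenstein asymmetry theta_hg, and hot-spot parameter rho_c). The function and parameter names are illustrative assumptions, the Henyey-Greenstein sign convention varies between references, and the exact formulation learned inside BRDF-NeRF may differ from this sketch.

```python
import numpy as np

def rpv_brdf(theta_i, theta_r, phi, rho_0, k, theta_hg, rho_c=None):
    """Standard-form Rahman-Pinty-Verstraete (RPV) reflectance (illustrative sketch).

    theta_i, theta_r : solar and viewing zenith angles (radians)
    phi              : relative azimuth between Sun and viewing directions (radians)
    rho_0            : amplitude parameter
    k                : Minnaert-like parameter (bowl shape for k < 1, bell shape for k > 1)
    theta_hg         : Henyey-Greenstein asymmetry (backward vs. forward scattering)
    rho_c            : hot-spot parameter; defaults to rho_0 in the 3-parameter variant
    """
    if rho_c is None:
        rho_c = rho_0

    mu_i, mu_r = np.cos(theta_i), np.cos(theta_r)

    # Minnaert-like term: [cos(theta_i) * cos(theta_r) * (cos(theta_i) + cos(theta_r))]^(k-1)
    minnaert = (mu_i * mu_r * (mu_i + mu_r)) ** (k - 1.0)

    # Phase angle g between illumination and viewing directions
    cos_g = mu_i * mu_r + np.sin(theta_i) * np.sin(theta_r) * np.cos(phi)

    # Henyey-Greenstein phase function (one common sign convention)
    f_hg = (1.0 - theta_hg**2) / (1.0 + 2.0 * theta_hg * cos_g + theta_hg**2) ** 1.5

    # Hot-spot factor, largest when the viewing direction approaches the Sun direction
    G = np.sqrt(np.tan(theta_i)**2 + np.tan(theta_r)**2
                - 2.0 * np.tan(theta_i) * np.tan(theta_r) * np.cos(phi))
    hot_spot = 1.0 + (1.0 - rho_c) / (1.0 + G)

    return rho_0 * minnaert * f_hg * hot_spot

# Example: reflectance for a 30° Sun zenith, 20° view zenith, 90° relative azimuth
print(rpv_brdf(np.radians(30.0), np.radians(20.0), np.radians(90.0),
               rho_0=0.2, k=0.8, theta_hg=-0.1))
```

Because the model is anisotropic, the same surface point returns different reflectance values as the viewing or Sun geometry changes, which is exactly the dependence BRDF-NeRF is designed to estimate from sparse multi-angle satellite views.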