{"title":"DynaSurfGS: Dynamic Surface Reconstruction with Planar-based Gaussian Splatting","authors":"Weiwei Cai, Weicai Ye, Peng Ye, Tong He, Tao Chen","doi":"arxiv-2408.13972","DOIUrl":null,"url":null,"abstract":"Dynamic scene reconstruction has garnered significant attention in recent\nyears due to its capabilities in high-quality and real-time rendering. Among\nvarious methodologies, constructing a 4D spatial-temporal representation, such\nas 4D-GS, has gained popularity for its high-quality rendered images. However,\nthese methods often produce suboptimal surfaces, as the discrete 3D Gaussian\npoint clouds fail to align with the object's surface precisely. To address this\nproblem, we propose DynaSurfGS to achieve both photorealistic rendering and\nhigh-fidelity surface reconstruction of dynamic scenarios. Specifically, the\nDynaSurfGS framework first incorporates Gaussian features from 4D neural voxels\nwith the planar-based Gaussian Splatting to facilitate precise surface\nreconstruction. It leverages normal regularization to enforce the smoothness of\nthe surface of dynamic objects. It also incorporates the as-rigid-as-possible\n(ARAP) constraint to maintain the approximate rigidity of local neighborhoods\nof 3D Gaussians between timesteps and ensure that adjacent 3D Gaussians remain\nclosely aligned throughout. Extensive experiments demonstrate that DynaSurfGS\nsurpasses state-of-the-art methods in both high-fidelity surface reconstruction\nand photorealistic rendering.","PeriodicalId":501174,"journal":{"name":"arXiv - CS - Graphics","volume":"30 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.13972","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Dynamic scene reconstruction has garnered significant attention in recent years owing to its ability to deliver high-quality, real-time rendering. Among the various methodologies, constructing a 4D spatial-temporal representation, such as 4D-GS, has gained popularity for its high-quality rendered images. However, these methods often produce suboptimal surfaces, as the discrete 3D Gaussian point clouds fail to align precisely with the object's surface. To address this problem, we propose DynaSurfGS to achieve both photorealistic rendering and high-fidelity surface reconstruction of dynamic scenes. Specifically, the DynaSurfGS framework first combines Gaussian features from 4D neural voxels with planar-based Gaussian Splatting to facilitate precise surface reconstruction. It leverages normal regularization to enforce smoothness on the surfaces of dynamic objects, and it incorporates an as-rigid-as-possible (ARAP) constraint to keep local neighborhoods of 3D Gaussians approximately rigid between timesteps, ensuring that adjacent 3D Gaussians remain closely aligned throughout. Extensive experiments demonstrate that DynaSurfGS surpasses state-of-the-art methods in both high-fidelity surface reconstruction and photorealistic rendering.
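
To make the ARAP constraint mentioned in the abstract concrete, the sketch below (plain NumPy) shows one common way to write an ARAP energy over k-nearest-neighbor edges between two consecutive timesteps of a Gaussian point set. The neighborhood size, the unweighted edges, and the assumption that per-point rotations are available are illustrative choices for this sketch, not the exact formulation or loss used in DynaSurfGS.

```python
# Minimal illustrative sketch of an as-rigid-as-possible (ARAP) energy
# between two timesteps of a set of 3D Gaussian centers.
# Assumptions (not from the paper): brute-force kNN neighborhoods,
# uniform edge weights, and per-point rotations supplied by the caller.
import numpy as np

def knn_indices(points, k):
    """Brute-force k-nearest neighbors (excluding the point itself)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    return np.argsort(d2, axis=1)[:, :k]

def arap_energy(p_prev, p_curr, rotations, k=8):
    """
    Sum over points i and neighbors j of
        || (p_curr[i] - p_curr[j]) - R_i @ (p_prev[i] - p_prev[j]) ||^2,
    penalizing non-rigid deformation of local neighborhoods between timesteps.

    p_prev, p_curr: (N, 3) Gaussian centers at consecutive timesteps.
    rotations:      (N, 3, 3) per-point rotation matrices (estimated or learned).
    """
    nbrs = knn_indices(p_prev, k)                          # (N, k) neighbor ids
    e_prev = p_prev[:, None, :] - p_prev[nbrs]             # (N, k, 3) edges at t-1
    e_curr = p_curr[:, None, :] - p_curr[nbrs]             # (N, k, 3) edges at t
    e_rot = np.einsum('nij,nkj->nki', rotations, e_prev)   # rigidly rotate old edges
    return ((e_curr - e_rot) ** 2).sum()
```

Minimizing such an energy drives each local neighborhood to move as a near-rigid unit from one timestep to the next, which is the behavior the abstract describes for keeping adjacent 3D Gaussians closely aligned over time.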