Fiber-level Woven Fabric Capture from a Single Photo
Zixuan Li, Pengfei Shen, Hanxiao Sun, Zibo Zhang, Yu Guo, Ligang Liu, Ling-Qi Yan, Steve Marschner, Milos Hasan, Beibei Wang
arXiv - CS - Graphics, published 2024-09-10, arXiv:2409.06368
Citations: 0
Abstract
Accurately rendering the appearance of fabrics is challenging due to their
complex 3D microstructures and specialized optical properties. If we model the
geometry and optics of fabrics down to the fiber level, we can achieve
unprecedented rendering realism, but this raises the difficulty of authoring or
capturing the fiber-level assets. Existing approaches can obtain fiber-level
geometry with special devices (e.g., CT) or complex hand-designed procedural
pipelines (manually tweaking a set of parameters). In this paper, we propose a
unified framework to capture fiber-level geometry and appearance of woven
fabrics using a single low-cost microscope image. We first use a simple neural
network to predict initial parameters of our geometric and appearance models.
From this starting point, we further optimize the parameters of procedural
fiber geometry and an approximated shading model via differentiable
rasterization to match the microscope photo more accurately. Finally, we refine
the fiber appearance parameters via differentiable path tracing, converging to
accurate fiber optical parameters, which are suitable for physically-based
light simulations to produce high-quality rendered results. We believe that our
method is the first to utilize differentiable rendering at the microscopic
level, supporting physically-based scattering from explicit fiber assemblies.
Our fabric parameter estimation achieves high-quality re-rendering of measured
woven fabric samples in both distant and close-up views. These results can
further be used for efficient rendering or converted to downstream
representations. We also propose a patch-space fiber geometry procedural
generation and a two-scale path tracing framework for efficient rendering of
fabric scenes.
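The central idea of the pipeline — adjusting the parameters of a procedural model by gradient descent so its rendering matches a photograph — can be illustrated with a toy sketch. This is not the paper's method: the "renderer" below is a hypothetical 1-D brightness profile with two appearance parameters (amplitude and bias), and gradients are taken by finite differences rather than a differentiable rasterizer or path tracer.

```python
# Toy analogue of inverse rendering: fit procedural parameters to a
# synthetic "photo" by minimizing an image-space MSE loss.
# All names (render, fit, pattern) are hypothetical, not from the paper.
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 256)
pattern = 0.5 * np.cos(3.0 * x)          # fixed procedural "weave profile"

def render(params):
    """Stand-in renderer: brightness = bias + amp * weave profile."""
    amp, bias = params
    return bias + amp * pattern

def loss(params, target):
    """Image-space mean-squared error against the reference photo."""
    return np.mean((render(params) - target) ** 2)

def fit(target, init, lr=0.3, steps=500, eps=1e-4):
    """Gradient descent with central finite-difference gradients."""
    p = np.array(init, dtype=float)
    for _ in range(steps):
        g = np.zeros_like(p)
        for i in range(p.size):
            d = np.zeros_like(p)
            d[i] = eps
            g[i] = (loss(p + d, target) - loss(p - d, target)) / (2 * eps)
        p -= lr * g
    return p

photo = render((0.8, 0.5))               # synthetic "microscope photo"
fitted = fit(photo, init=(0.2, 0.1))     # recovers amp ≈ 0.8, bias ≈ 0.5
```

In the paper's setting, the renderer is a differentiable rasterizer (for geometry and an approximated shading model) and later a differentiable path tracer (for fiber optical parameters), so gradients come from the rendering system itself instead of finite differences, and the neural network supplies the initial guess.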