Neural Biplane Representation for BTF Rendering and Acquisition

Jiahui Fan, Beibei Wang, Miloš Hašan, Jian Yang, Ling-Qi Yan
DOI: 10.1145/3588432.3591505
Published in: ACM SIGGRAPH 2023 Conference Proceedings, 2023-07-23
Citations: 1

Abstract

Bidirectional Texture Functions (BTFs) are able to represent complex materials with greater generality than traditional analytical models. This holds true for both measured real materials and synthetic ones. Recent advancements in neural BTF representations have significantly reduced storage costs, making them more practical for use in rendering. These representations typically combine spatial feature (latent) textures with neural decoders that handle angular dimensions per spatial location. However, these models have yet to combine fast compression and inference, accuracy, and generality. In this paper, we propose a biplane representation for BTFs, which uses a feature texture in the half-vector domain as well as the spatial domain. This allows the learned representation to encode high-frequency details in both the spatial and angular domains. Our decoder is small yet general, meaning it is trained once and fixed. Additionally, we optionally combine this representation with a neural offset module for parallax and masking effects. Our model can represent a broad range of BTFs and has fast compression and inference due to its lightweight architecture. Furthermore, it enables a simple way to capture BTF data. By taking about 20 cell phone photos with a collocated camera and flash, our model can plausibly recover the entire BTF, despite never observing function values with differing view and light directions. We demonstrate the effectiveness of our model in the acquisition of many measured materials, including challenging materials such as fabrics.
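The core idea of the biplane representation — one latent texture indexed by spatial position and one indexed by the half-vector, with a small shared decoder — can be sketched as follows. This is a minimal, hypothetical illustration: the texture sizes, latent dimensions, projection, and decoder here are assumptions, not the paper's actual architecture.

```python
import numpy as np

def half_vector(wi, wo):
    """Unit half-vector between light (wi) and view (wo) directions."""
    h = wi + wo
    return h / np.linalg.norm(h)

def sample_texture(tex, u, v):
    """Nearest-neighbor lookup into a latent texture of shape (H, W, C)."""
    H, W, _ = tex.shape
    x = min(int(u * W), W - 1)
    y = min(int(v * H), H - 1)
    return tex[y, x]

def query_btf(spatial_tex, halfvec_tex, decoder, u, v, wi, wo):
    """Concatenate the spatial and half-vector latents, then decode to RGB."""
    h = half_vector(wi, wo)
    # Project the half-vector's x/y components to [0, 1] texture coordinates
    # (an illustrative parameterization, not necessarily the paper's).
    hu, hv = (h[0] + 1) / 2, (h[1] + 1) / 2
    z = np.concatenate([sample_texture(spatial_tex, u, v),
                        sample_texture(halfvec_tex, hu, hv)])
    return decoder(z)

# Toy decoder: a single linear layer standing in for the small, fixed MLP.
rng = np.random.default_rng(0)
W_dec = rng.normal(size=(3, 16))
decoder = lambda z: W_dec @ z

spatial_tex = rng.normal(size=(64, 64, 8))   # spatial latent plane
halfvec_tex = rng.normal(size=(32, 32, 8))   # half-vector latent plane

wi = np.array([0.0, 0.0, 1.0])
wo = np.array([0.3, 0.0, 0.95])
wo /= np.linalg.norm(wo)
rgb = query_btf(spatial_tex, halfvec_tex, decoder, 0.5, 0.5, wi, wo)
```

Because the decoder is queried per shading point with only two texture fetches and a tiny network, inference stays fast; the half-vector plane is what lets the model encode sharp angular (specular) detail that a purely spatial latent texture would blur.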