Neural Fields for Scalable Scene Reconstruction

J. Tompkin
{"title":"用于可扩展场景重建的神经场","authors":"J. Tompkin","doi":"10.47330/dcio.2022.axbl8798","DOIUrl":null,"url":null,"abstract":"Neural fields are a new (and old!) approach to solving problems over spacetime via first-order optimization of a neural network. Over the past three years, combining neural fields with classic computer graphics approaches have allowed us to make significant advances in solving computer vision problems like scene reconstruction. I will present recent work that can reconstruct indoor scenes for photorealistic interactive exploration using new scalable hybrid neural field representations. This has applications where any real-world place needs to be digitized, especially for visualization purposes.","PeriodicalId":129906,"journal":{"name":"Design Computation Input/Output 2022","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Neural Fields for Scalable Scene Reconstruction\",\"authors\":\"J. Tompkin\",\"doi\":\"10.47330/dcio.2022.axbl8798\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Neural fields are a new (and old!) approach to solving problems over spacetime via first-order optimization of a neural network. Over the past three years, combining neural fields with classic computer graphics approaches have allowed us to make significant advances in solving computer vision problems like scene reconstruction. I will present recent work that can reconstruct indoor scenes for photorealistic interactive exploration using new scalable hybrid neural field representations. This has applications where any real-world place needs to be digitized, especially for visualization purposes.\",\"PeriodicalId\":129906,\"journal\":{\"name\":\"Design Computation Input/Output 2022\",\"volume\":\"49 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Design Computation Input/Output 2022\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.47330/dcio.2022.axbl8798\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Design Computation Input/Output 2022","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.47330/dcio.2022.axbl8798","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Neural fields are a new (and old!) approach to solving problems over spacetime via first-order optimization of a neural network. Over the past three years, combining neural fields with classic computer graphics approaches has allowed us to make significant advances in solving computer vision problems like scene reconstruction. I will present recent work that reconstructs indoor scenes for photorealistic interactive exploration using new scalable hybrid neural field representations. This has applications wherever a real-world place needs to be digitized, especially for visualization purposes.
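
To make the core idea concrete, below is a minimal sketch of a neural field, assuming PyTorch is available: a small coordinate MLP with a Fourier-feature encoding is fit to a toy 1D signal by first-order optimization (Adam). The target signal, network sizes, and encoding are illustrative assumptions, not the hybrid representation described in the talk.

import torch
import torch.nn as nn

def fourier_features(x, n_freqs=6):
    # Encode coordinates with sines/cosines at increasing frequencies so a
    # small MLP can represent high-frequency detail.
    freqs = 2.0 ** torch.arange(n_freqs, dtype=x.dtype)      # (F,)
    angles = 2.0 * torch.pi * x * freqs                      # (N, F) by broadcasting
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

class NeuralField(nn.Module):
    # A field f(x): coordinate -> value, represented entirely by MLP weights.
    def __init__(self, n_freqs=6, hidden=64):
        super().__init__()
        self.n_freqs = n_freqs
        self.net = nn.Sequential(
            nn.Linear(2 * n_freqs, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(fourier_features(x, self.n_freqs))

# Toy "scene": a 1D signal on [0, 1] that the field must reconstruct.
coords = torch.linspace(0.0, 1.0, 256).unsqueeze(-1)          # (256, 1)
target = torch.sin(8.0 * torch.pi * coords) * torch.exp(-3.0 * coords)

model = NeuralField()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    optimizer.zero_grad()
    loss = torch.mean((model(coords) - target) ** 2)          # reconstruction loss
    loss.backward()                                           # first-order gradients
    optimizer.step()

print(f"final MSE: {loss.item():.6f}")

In actual scene reconstruction, the toy target above is replaced by a photometric loss computed by rendering the field along camera rays (the classic computer graphics component), and hybrid representations pair the MLP with spatial feature grids so the approach scales to whole indoor scenes.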