3D reconstruction in industrial environments based on an improved neural radiation field method research

Kaichen Zhou, Tianlun Huang, Weijun Wang, Haowen Luo, Fawad Khan, Ziqian Du, Wei Feng
{"title":"3D reconstruction in industrial environments based on an improved neural radiation field method research","authors":"Kaichen Zhou, Tianlun Huang, Weijun Wang, Haowen Luo, Fawad Khan, Ziqian Du, Wei Feng","doi":"10.1117/12.3031943","DOIUrl":null,"url":null,"abstract":"Neural Radiation Field (NeRF) is driving the development of 3D reconstruction technology. Several NeRF variants have been proposed to improve rendering accuracy and reconstruction speed. One of the most significant variants, TensoRF, uses a 4D tensor to model the radiation field, resulting in improved accuracy and speed. However, reconstruction quality remains limited. This study presents an improved TensoRF that addresses the aforementioned issues by reconstructing its multilayer perceptron network. Increasing the number of neurons in the input and network layers improves the render accuracy. To accelerate the reconstruction speed, we utilized the Nadam optimization algorithm and the RELU6 activation function. Our experiments on various classical datasets demonstrate that the PSNR value of the improved TensoRF is higher than that of the original TensoRF. Additionally, the improved TensoRF has a faster reconstruction speed (≤30min). Finally, we applied the improved TensoRF to a self-made industrial dataset. The results showed better global accuracy and local texture in the reconstructed image.","PeriodicalId":198425,"journal":{"name":"Other Conferences","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Other Conferences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.3031943","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Neural Radiance Fields (NeRF) are driving the development of 3D reconstruction technology, and several NeRF variants have been proposed to improve rendering accuracy and reconstruction speed. One of the most significant variants, TensoRF, models the radiance field as a 4D tensor, improving both accuracy and speed; however, its reconstruction quality remains limited. This study presents an improved TensoRF that addresses these issues by restructuring its multilayer perceptron network: increasing the number of neurons in the input and hidden layers improves rendering accuracy, while the Nadam optimization algorithm and the ReLU6 activation function accelerate reconstruction. Experiments on several classical datasets show that the improved TensoRF achieves higher PSNR values than the original TensoRF and reconstructs scenes faster (≤ 30 min). Finally, we applied the improved TensoRF to a self-made industrial dataset; the reconstructed images show better global accuracy and local texture.
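The two network-level changes the abstract describes (a wider shading MLP with ReLU6 activations, trained with the Nadam optimizer) can be pictured with a minimal PyTorch sketch. This is not the authors' implementation: the class name WidenedRGBDecoder, the layer widths, and the feature/view-encoding dimensions are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code) of a TensoRF-style shading MLP
# with a widened input/hidden layer, ReLU6 activations, and the NAdam optimizer.
import torch
import torch.nn as nn


class WidenedRGBDecoder(nn.Module):
    """Maps per-sample appearance features plus a view-direction encoding to RGB,
    analogous to TensoRF's shading MLP but with more neurons and ReLU6."""

    def __init__(self, feat_dim=27, view_enc_dim=27, hidden_dim=256):
        super().__init__()
        in_dim = feat_dim + view_enc_dim
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),   # widened input layer
            nn.ReLU6(inplace=True),          # bounded activation
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU6(inplace=True),
            nn.Linear(hidden_dim, 3),
            nn.Sigmoid(),                    # RGB in [0, 1]
        )

    def forward(self, features, view_encoding):
        return self.net(torch.cat([features, view_encoding], dim=-1))


# NAdam (Adam with Nesterov momentum) replaces the usual Adam optimizer.
model = WidenedRGBDecoder()
optimizer = torch.optim.NAdam(model.parameters(), lr=2e-3)
```

ReLU6 clamps activations to [0, 6], which keeps intermediate values bounded, and NAdam augments Adam with Nesterov momentum; both choices are consistent with the faster convergence the abstract reports.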