3DGEN: a framework for generating custom-made synthetic 3D datasets for civil structure health monitoring

Yanda Shao, Ling Li, Jun Li, Qilin Li, Senjian An, Hong Hao
Structural Health Monitoring, published 2024-07-27.
DOI: 10.1177/14759217241265540

Abstract

The availability of high-quality datasets is increasingly critical in computer vision-based civil structural health monitoring, where deep learning approaches have gained prominence. However, the lack of specialized datasets for such tasks poses a significant challenge for training reliable models. To address this challenge, a framework, 3DGEN, is proposed to swiftly generate realistic synthetic 3D datasets that can be targeted at specific tasks. The framework is built on diverse 3D civil structural models, rendering them from various angles and providing depth information and camera parameters for training neural networks. By employing mathematical methods, such as analytical solutions and/or numerical simulations, deformations of civil engineering structures can be generated, ensuring a reliable representation of their real-world shapes and characteristics in the 3D datasets. For texture generation, a generative 3D texturing method enables users to specify desired textures using plain English sentences. Two experiments are conducted to (1) assess the efficiency of generating 3D datasets for two distinct structures, and (2) train a monocular depth estimation network to perform 3D surface reconstruction with the generated dataset. Notably, 3DGEN is not limited to 3D surface reconstruction; it can also be used to train neural networks for various other tasks. The code and dataset are available at: https://github.com/YANDA-SHAO/Beam-Dataset-SE
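As an illustration of the deformation step described above, the sketch below applies the closed-form Euler-Bernoulli deflection of a simply supported beam under a uniform distributed load to the vertices of a beam mesh. This is a minimal sketch under stated assumptions, not the paper's actual implementation: the function names, the load case, and the placeholder vertex array are all illustrative.

```python
import numpy as np

def beam_deflection(x, q, L, E, I):
    """Euler-Bernoulli deflection of a simply supported beam under a
    uniform load q: w(x) = q*x*(L^3 - 2*L*x^2 + x^3) / (24*E*I)."""
    return q * x * (L**3 - 2 * L * x**2 + x**3) / (24 * E * I)

def deform_vertices(vertices, q, L, E, I):
    """Shift each mesh vertex (rows of [x, y, z]) downward by the
    analytical deflection at its position x along the span."""
    deformed = vertices.copy()
    deformed[:, 1] -= beam_deflection(vertices[:, 0], q, L, E, I)
    return deformed

# Illustrative case: 5 m steel beam (E = 210 GPa, I = 8e-5 m^4)
# under a uniform load of 10 kN/m; three sample vertices on the axis.
L_beam, E_steel, I_sec, q_load = 5.0, 210e9, 8.0e-5, 10e3
verts = np.array([[0.0, 0.0, 0.0], [2.5, 0.0, 0.0], [5.0, 0.0, 0.0]])
out = deform_vertices(verts, q_load, L_beam, E_steel, I_sec)
# Supports at x = 0 and x = L stay fixed; the midspan vertex sags
# by 5*q*L^4 / (384*E*I), the textbook maximum deflection.
```

In the framework, such an analytical (or numerically simulated) displacement field would be applied to the full vertex set of a structural model before rendering, so that the synthetic images and depth maps reflect a physically plausible deformed shape.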