Method for Automated Data Collection for 3D Reconstruction

M. Zaslavskiy, Roman Shestopalov, Alexander Grebenshchikov, Danil Korenev, Evgeny Shkvirya
{"title":"Method for Automated Data Collection for 3D Reconstruction","authors":"M. Zaslavskiy, Roman Shestopalov, Alexander Grebenshchikov, Danil Korenev, Evgeny Shkvirya","doi":"10.23919/FRUCT56874.2022.9953825","DOIUrl":null,"url":null,"abstract":"The paper presents an automated method to guide a human operator during RGB image shot for improving the quality of further 3D reconstruction for low-rise outdoor objects and a use case for method application. The method provides automation as a three-step process: local analysis of images performed during the shooting, global analysis, and recommendations performed after the shooting. Method steps filter out defective images, approximate future 3D-model using Structure-from-Motion (SfM), and map it to human operator trajectory estimation to identify object areas that will have low resolution on the final 3D model, requiring additional shooting. Method structure does not require any sensors except RGB camera and inertial sensors and does not rely on any external backend, which lowers the hardware requirement. The authors implemented the method and use-case as an Android application. The method was evaluated by experiments on outdoor datasets created for this study. The evaluation shows that the local analysis stage is fast enough to perform during the shooting process and that the local analysis stage improves the quality of the final 3D model.","PeriodicalId":274664,"journal":{"name":"2022 32nd Conference of Open Innovations Association (FRUCT)","volume":"78 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 32nd Conference of Open Innovations Association (FRUCT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/FRUCT56874.2022.9953825","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

The paper presents an automated method that guides a human operator during RGB image capture to improve the quality of subsequent 3D reconstruction of low-rise outdoor objects, together with a use case for applying the method. The method automates the process in three steps: local analysis of images performed during shooting, followed by global analysis and recommendations performed after shooting. These steps filter out defective images, approximate the future 3D model using Structure-from-Motion (SfM), and map it onto an estimate of the operator's trajectory to identify object areas that will have low resolution in the final 3D model and therefore require additional shooting. The method requires no sensors other than an RGB camera and inertial sensors and does not rely on any external backend, which lowers the hardware requirements. The authors implemented the method and the use case as an Android application. The method was evaluated in experiments on outdoor datasets created for this study. The evaluation shows that the local analysis stage is fast enough to run during shooting and that it improves the quality of the final 3D model.
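The abstract does not specify which defect criterion the local analysis stage applies. As an illustration only, the following minimal Python sketch shows one plausible per-image check, rejecting blurred frames via the variance of the Laplacian; the function name, threshold value, and use of OpenCV are assumptions for demonstration, not the paper's stated filter.

```python
import cv2

def is_sharp_enough(image_path: str, threshold: float = 100.0) -> bool:
    """Illustrative local-analysis check: accept a frame only if it looks sharp.

    The blur metric (variance of the Laplacian) and the threshold are
    assumptions for this sketch; the paper does not specify its exact criterion.
    """
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # High variance of the Laplacian indicates strong edges, i.e. a sharp frame.
    focus_measure = cv2.Laplacian(gray, cv2.CV_64F).var()
    return focus_measure >= threshold
```

In a pipeline like the one described, such a check would run on each image immediately after capture, so that defective shots can be discarded or retaken before the post-shooting global analysis.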