A New Framework for Generating Indoor 3D Digital Models from Point Clouds

IF 4.2 | CAS Tier 2 (Earth Science) | JCR Q2 (Environmental Sciences)
Remote Sensing | Pub Date: 2024-09-18 | DOI: 10.3390/rs16183462
Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu
{"title":"从点云生成室内 3D 数字模型的新框架","authors":"Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu","doi":"10.3390/rs16183462","DOIUrl":null,"url":null,"abstract":"Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, virtual reality, and so on. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended, intersected, and, by solving an integer programming problem, line segments are selected to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. Then the roofs of the point cloud are extracted and used to perform an overlap analysis with the generated room instance map to segment the entire roof point cloud, obtaining the roof for each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.","PeriodicalId":48993,"journal":{"name":"Remote Sensing","volume":"166 1","pages":""},"PeriodicalIF":4.2000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A New Framework for Generating Indoor 3D Digital Models from Point Clouds\",\"authors\":\"Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu\",\"doi\":\"10.3390/rs16183462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, virtual reality, and so on. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended, intersected, and, by solving an integer programming problem, line segments are selected to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. Then the roofs of the point cloud are extracted and used to perform an overlap analysis with the generated room instance map to segment the entire roof point cloud, obtaining the roof for each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. 
Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.\",\"PeriodicalId\":48993,\"journal\":{\"name\":\"Remote Sensing\",\"volume\":\"166 1\",\"pages\":\"\"},\"PeriodicalIF\":4.2000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.3390/rs16183462\",\"RegionNum\":2,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENVIRONMENTAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/rs16183462","RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENVIRONMENTAL SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, and virtual reality. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended and intersected, and a subset of them is selected by solving an integer programming problem to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. The roof points are then extracted from the point cloud, and an overlap analysis against the generated room instance map segments the roof point cloud into the roof of each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.
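One step of the pipeline, rasterizing room polygons and using image connectivity detection to obtain a room instance map, can be illustrated with a small sketch. The code below is a minimal, hypothetical illustration rather than the authors' implementation: the function name room_instance_map, the grid resolution, and the toy two-room scene are assumptions made for demonstration, and the integer-programming selection of wall segments described in the abstract is skipped.

```python
# Illustrative sketch only (not the paper's code): rasterize wall line
# segments onto a grid, then label the enclosed free space with image
# connectivity detection so each connected region becomes one room instance.
import numpy as np
from scipy import ndimage


def room_instance_map(wall_segments, resolution=0.05, grid_size=(200, 200)):
    """Rasterize 2D wall segments (in metres) and label rooms by connectivity."""
    grid = np.zeros(grid_size, dtype=np.uint8)

    # Draw each wall segment into the raster (1 = wall pixel).
    for (x0, y0), (x1, y1) in wall_segments:
        n = int(max(abs(x1 - x0), abs(y1 - y0)) / resolution) + 1
        cols = np.clip((np.linspace(x0, x1, n) / resolution).astype(int), 0, grid_size[1] - 1)
        rows = np.clip((np.linspace(y0, y1, n) / resolution).astype(int), 0, grid_size[0] - 1)
        grid[rows, cols] = 1

    # 4-connected labelling of the free space; each label is one room candidate.
    labels, num_regions = ndimage.label(grid == 0)

    # Discard regions touching the image border (exterior, not rooms).
    border = np.unique(np.concatenate(
        [labels[0, :], labels[-1, :], labels[:, 0], labels[:, -1]]))
    for b in border:
        labels[labels == b] = 0

    return labels, num_regions


if __name__ == "__main__":
    # Toy 4 m x 4 m scene split into two rooms by a middle wall.
    walls = [((0, 0), (4, 0)), ((4, 0), (4, 4)), ((4, 4), (0, 4)),
             ((0, 4), (0, 0)), ((2, 0), (2, 4))]
    labels, _ = room_instance_map(walls, resolution=0.05, grid_size=(81, 81))
    print("room labels found:", np.unique(labels))  # e.g. [0 1 2] -> two rooms
```

In the paper's actual framework, the candidate wall segments are first extended, intersected, and filtered via the integer program before rasterization, and the labelled regions are then matched against the extracted roof points through overlap analysis to segment the roof of each room.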
Source journal: Remote Sensing
CiteScore: 8.30
Self-citation rate: 24.00%
Articles per year: 5435
Review time: 20.66 days
About the journal: Remote Sensing (ISSN 2072-4292) publishes regular research papers, reviews, letters and communications covering all aspects of the remote sensing process, from instrument design and signal processing to the retrieval of geophysical parameters and their application in geosciences. The journal encourages scientists to publish experimental, theoretical and computational results in as much detail as possible so that results can be easily reproduced; there is no restriction on paper length, and full experimental details must be provided.