Deep-learning-based longitudinal joint opening detection method for metro shield tunnel

IF 6.7 · SCI Zone 1 (Engineering & Technology) · Q1 CONSTRUCTION & BUILDING TECHNOLOGY
Anbin Yu , Wensheng Mei
DOI: 10.1016/j.tust.2024.106108
Journal: Tunnelling and Underground Space Technology, Volume 154, Article 106108
Published: 2024-10-14
URL: https://www.sciencedirect.com/science/article/pii/S0886779824005261
Citations: 0

Abstract

In this paper, a longitudinal joint opening detection method is proposed, built on a precise longitudinal segment joint extraction algorithm that uses deep neural networks (DNNs). The proposed method consists of four steps. First, a mobile scanning system is employed to obtain three-dimensional metro shield tunnel point clouds. Second, two small DNNs, YOLOv5 and JLNet, are designed to accurately extract the longitudinal segment joint lines from images generated from the scanned point clouds: YOLOv5 rapidly detects the approximate longitudinal segment joint areas, while JLNet precisely fits the joint lines. Third, using the extracted segment joint lines, the points belonging to different tunnel segments are segmented accordingly. Finally, based on the tunnel segment point clouds, a joint opening angle calculation method that combines cylinder projection and plane-fitting algorithms is proposed. Experimental results demonstrate that the proposed DNN-based method can accurately extract segment joint lines without being influenced by the tunnel equipment around the segment joints. The YOLOv5 network exhibited a classification accuracy of 0.9907 and a bounding box prediction error of 0.004. For the JLNet network, the line slope prediction error was 0.0072, with an intercept error of 1.53 pixels. The spatial distribution pattern of joint opening was identified by comparing the joint opening angles in deformed and undeformed tunnels. Additionally, the accuracy of the proposed method was evaluated, revealing a joint opening angle detection external accuracy of 0.13°.
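The final step described above can be illustrated with a minimal sketch (not the authors' implementation): fit a plane to each tunnel segment's points near a joint and take the dihedral angle between the two fitted planes as the joint opening angle, with a cylindrical-projection helper of the kind used to unroll the scanned points into images. All function names and parameters here are illustrative assumptions.

```python
# Hedged sketch of the plane-fitting / cylinder-projection step; names and
# signatures are illustrative, not taken from the paper.
import numpy as np

def unroll_to_image_coords(points, radius):
    """Cylindrical projection: map (x, y, z) points (tunnel axis along z)
    to 2-D (arc length, z) coordinates, as used to generate joint images."""
    theta = np.arctan2(points[:, 1], points[:, 0])
    return np.column_stack([radius * theta, points[:, 2]])

def fit_plane(points):
    """Least-squares plane through an (N, 3) cloud: unit normal + centroid."""
    centroid = points.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centered cloud is the direction of least variance, i.e. the normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return vt[-1], centroid

def joint_opening_angle_deg(segment_a, segment_b):
    """Dihedral angle (degrees) between the planes fitted to two adjacent
    segment point clouds; 0 means the segments are coplanar (no opening)."""
    n_a, _ = fit_plane(segment_a)
    n_b, _ = fit_plane(segment_b)
    cos_t = np.clip(abs(np.dot(n_a, n_b)), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_t)))
```

The SVD-based plane fit is the standard total-least-squares choice for noisy scans, since it minimizes perpendicular (not vertical) distances to the plane.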
Source journal: Tunnelling and Underground Space Technology (Engineering: Civil)
CiteScore: 11.90
Self-citation rate: 18.80%
Annual publications: 454
Average review time: 10.8 months
Journal description: Tunnelling and Underground Space Technology is an international journal which publishes authoritative articles encompassing the development of innovative uses of underground space and the results of high-quality research into improved, more cost-effective techniques for the planning, geo-investigation, design, construction, operation and maintenance of underground and earth-sheltered structures. The journal provides an effective vehicle for the improved worldwide exchange of information on developments in underground technology, and the experience gained from its use, and is strongly committed to publishing papers on the interdisciplinary aspects of creating, planning, and regulating underground space.