Segmentation and Quantification of Surface Defects in 3D Reconstructions for Damage Assessment and Inspection

Authors: Jonathan Sterckx; Michiel Vlaminck; Hiep Luong
DOI: 10.1109/TASE.2025.3593967
Journal: IEEE Transactions on Automation Science and Engineering, vol. 22, pp. 19439-19450 (Q1, Automation & Control Systems; impact factor 6.4)
Published: 2025-07-30
URL: https://ieeexplore.ieee.org/document/11104073/
Citations: 0
Abstract
Detecting surface defects is crucial for maintaining the integrity of critical infrastructure. Traditional RGB image-based methods are limited by their reliance on 2D information, which impairs accurate damage assessment. This paper introduces a novel approach that enhances defect detection and quantification, utilizing dense 3D reconstructions generated through techniques like photogrammetry or profilometry. We develop an improved robust spline fitting algorithm to estimate the undamaged surfaces from the 3D reconstructions. The residual distances between the observed and fitted surfaces are subsequently used to segment and quantify defects. By leveraging 3D data, our method resolves visual ambiguities and enables damage quantification using physically meaningful metrics. For 3D models based on optical sensing, our method complements RGB image-based defect detectors and classifiers, facilitating the fusion of visual and 3D information for a more comprehensive defect analysis. Validated on both synthetic and real-world datasets, our method demonstrates strong performance and practical feasibility.

Note to Practitioners: Our research is driven by the growing potential of drone-based inspections using high-resolution imaging platforms, which offer significant advantages for monitoring remote or hard-to-reach infrastructure. With high-quality images, we can use photogrammetry to reconstruct accurate 3D models directly from inspection images, without requiring additional sensors or manual intervention. We leverage this 3D data to improve the robustness of automated defect detection and enable the precise quantification of defect sizes and material loss. Tracking these metrics over multiple inspections can provide valuable information for preventive and predictive maintenance, moving us closer to efficient, comprehensive structural health monitoring.
While our method is highly effective across various surfaces, it is less suited to detecting widespread shallow damage, which could benefit from incorporating additional geometric constraints.
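The fit-then-threshold pipeline the abstract describes (fit the undamaged surface robustly, then segment and quantify defects from the residuals) can be sketched on a toy 1D profile. This is only an illustrative sketch: it uses an iteratively reweighted least-squares (IRLS) polynomial fit as a stand-in for the paper's robust spline, and the profile, defect shape, and thresholds are all invented for the example.

```python
import numpy as np

# Synthetic 1D profile: a smooth (quadratic) undamaged surface with one dent
# of depth 0.2 between x = 4 and x = 5 (all values are arbitrary units).
x = np.linspace(0.0, 10.0, 500)
undamaged = 0.02 * (x - 5.0) ** 2
in_defect = (x > 4.0) & (x < 5.0)
observed = undamaged - np.where(in_defect, 0.2, 0.0)

# Robust surface estimate via IRLS: points with large residuals (likely
# defects) are progressively down-weighted, so the fit converges toward the
# undamaged surface. A low-order polynomial stands in for the paper's spline.
weights = np.ones_like(x)
for _ in range(5):
    coeffs = np.polyfit(x, observed, deg=3, w=weights)
    residuals = observed - np.polyval(coeffs, x)
    scale = 1.4826 * np.median(np.abs(residuals)) + 1e-12   # robust MAD scale
    weights = 1.0 / (1.0 + (residuals / (3.0 * scale)) ** 2)  # Cauchy weights

# Segmentation: defect = points lying significantly below the fitted surface.
# Quantification: maximum depth and material loss (area between the surfaces).
defect = residuals < -3.0 * scale
depth = float(-residuals.min())
dx = x[1] - x[0]
material_loss = float(np.sum(np.where(defect, -residuals, 0.0)) * dx)
print(f"max depth ~ {depth:.3f}, material loss ~ {material_loss:.3f}")
```

On this toy profile the recovered depth and material loss land close to the ground-truth values (0.2 deep over a width of 1.0). In 3D the same idea applies with a surface spline and point-to-surface distances, and, as noted above, widespread shallow damage is harder because the robust fit can absorb it.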
About the journal:
The IEEE Transactions on Automation Science and Engineering (T-ASE) publishes fundamental papers on Automation, emphasizing scientific results that advance efficiency, quality, productivity, and reliability. T-ASE encourages interdisciplinary approaches from computer science, control systems, electrical engineering, mathematics, mechanical engineering, operations research, and other fields. T-ASE welcomes results relevant to industries such as agriculture, biotechnology, healthcare, home automation, maintenance, manufacturing, pharmaceuticals, retail, security, service, supply chains, and transportation. T-ASE addresses a research community willing to integrate knowledge across disciplines and industries. For this purpose, each paper includes a Note to Practitioners that summarizes how its results can be applied or how they might be extended to apply in practice.