Deep learning-based restoration method for missing fringe information in the high reflectivity regions

IF 3.7 · CAS Tier 2 (Engineering & Technology) · JCR Q2 (ENGINEERING, MANUFACTURING)
Longxiang Zhang, Yixin Ji, Wei Wu, Jianhua Wang
{"title":"Deep learning-based restoration method for missing fringe information in the high reflectivity regions","authors":"Longxiang Zhang,&nbsp;Yixin Ji,&nbsp;Wei Wu,&nbsp;Jianhua Wang","doi":"10.1016/j.precisioneng.2025.06.006","DOIUrl":null,"url":null,"abstract":"<div><div>Conventional fringe projection profilometry (FPP) based on a single-exposure is limited in achieving high-precision 3D measurements when processing fringe images that contain saturated regions, as distortion in these areas significantly reduces the measurement accuracy. Although high dynamic range (HDR) methods can mitigate this issue, they require additional conditions, which increase both measurement costs and complexity. To address the limitations of traditional HDR methods, a method based on deep learning is proposed for restoring missing fringe information. The proposed method employs a simple U-Net architecture to efficiently restore saturated fringe images by leveraging high-dimensional feature representations and skip connections within the network. In addition, combining the restored fringe information provided by the method presented enables 3D measurement of the object under different measurement systems. The results of the experiments confirm that the method accurately restores the fringe information in saturated images, making the restored images suitable for high-precision 3D reconstruction. Furthermore, by integrating the restored fringe information, the proposed method enables 3D reconstruction using saturated fringe images captured under different measurement systems, which include those with varying saturation levels and those captured from different angles. This demonstrates that the proposed method restores missing fringe information in saturated images and exhibits strong generalization capabilities.</div></div>","PeriodicalId":54589,"journal":{"name":"Precision Engineering-Journal of the International Societies for Precision Engineering and Nanotechnology","volume":"96 ","pages":"Pages 80-93"},"PeriodicalIF":3.7000,"publicationDate":"2025-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Precision Engineering-Journal of the International Societies for Precision Engineering and Nanotechnology","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0141635925001941","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MANUFACTURING","Score":null,"Total":0}
Citations: 0

Abstract

Conventional fringe projection profilometry (FPP) based on a single exposure is limited in achieving high-precision 3D measurement when processing fringe images that contain saturated regions, as distortion in these areas significantly reduces measurement accuracy. Although high dynamic range (HDR) methods can mitigate this issue, they require additional conditions, which increase both measurement cost and complexity. To address the limitations of traditional HDR methods, a deep learning-based method for restoring missing fringe information is proposed. The proposed method employs a simple U-Net architecture to efficiently restore saturated fringe images by leveraging high-dimensional feature representations and skip connections within the network. In addition, combining the restored fringe information enables 3D measurement of the object under different measurement systems. Experimental results confirm that the method accurately restores fringe information in saturated images, making the restored images suitable for high-precision 3D reconstruction. Furthermore, by integrating the restored fringe information, the proposed method enables 3D reconstruction from saturated fringe images captured under different measurement systems, including images with varying saturation levels and images captured from different angles. This demonstrates that the proposed method restores missing fringe information in saturated images and exhibits strong generalization capability.
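The abstract relies on the standard phase-shifting model of FPP to explain why saturation degrades accuracy; the following is textbook background (an assumption, not quoted from the article) making that step explicit:

```latex
% Standard N-step phase-shifting model (textbook FPP background, not from
% the article). Intensity of the n-th captured fringe image at pixel (x, y):
\[
  I_n(x,y) = A(x,y) + B(x,y)\cos\!\Big(\varphi(x,y) + \frac{2\pi n}{N}\Big),
  \qquad n = 0,\dots,N-1,
\]
% where A is the background, B the modulation, and \varphi the wrapped phase
% encoding depth, recovered by least squares over the N images:
\[
  \varphi(x,y) = -\arctan\frac{\sum_{n=0}^{N-1} I_n(x,y)\,\sin(2\pi n/N)}
                              {\sum_{n=0}^{N-1} I_n(x,y)\,\cos(2\pi n/N)}.
\]
% In highly reflective regions the camera clips I_n at its maximum gray level
% (e.g., 255 for an 8-bit sensor); the cosine model then no longer holds and
% the recovered phase, hence the 3D reconstruction, is distorted.
```

To make the network description concrete, here is a minimal sketch of a U-Net-style encoder-decoder with skip connections for image-to-image fringe restoration, written in PyTorch. The depth, channel widths, and single-channel input are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal U-Net-style sketch for fringe restoration. Layer sizes are
# illustrative assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        # Encoder: downsample while widening the channel dimension.
        self.enc1 = conv_block(in_ch, 64)
        self.enc2 = conv_block(64, 128)
        self.enc3 = conv_block(128, 256)
        self.pool = nn.MaxPool2d(2)
        # Bottleneck holds the high-dimensional feature representation.
        self.bottleneck = conv_block(256, 512)
        # Decoder: upsample and fuse with matching encoder features
        # via skip connections (channel-wise concatenation).
        self.up3 = nn.ConvTranspose2d(512, 256, 2, stride=2)
        self.dec3 = conv_block(512, 256)
        self.up2 = nn.ConvTranspose2d(256, 128, 2, stride=2)
        self.dec2 = conv_block(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec1 = conv_block(128, 64)
        self.head = nn.Conv2d(64, out_ch, 1)  # restored fringe image

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        b = self.bottleneck(self.pool(e3))
        d3 = self.dec3(torch.cat([self.up3(b), e3], dim=1))
        d2 = self.dec2(torch.cat([self.up2(d3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# Usage: map a saturated fringe image to a restored one.
net = UNet(in_ch=1, out_ch=1)
saturated = torch.rand(1, 1, 256, 256)  # dummy single-channel fringe image
restored = net(saturated)
print(restored.shape)  # torch.Size([1, 1, 256, 256])
```

Trained on pairs of saturated and unsaturated fringe images (e.g., with an L1 or L2 loss on the restored image), such a network maps a clipped fringe pattern to a plausible unclipped one: the skip connections carry fine fringe detail from encoder to decoder, while the bottleneck provides the high-dimensional feature representation the abstract mentions.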
Source journal
CiteScore: 7.40
Self-citation rate: 5.60%
Annual articles: 177
Review time: 46 days
Journal description: Precision Engineering - Journal of the International Societies for Precision Engineering and Nanotechnology is devoted to the multidisciplinary study and practice of high accuracy engineering, metrology, and manufacturing. The journal takes an integrated approach to all subjects related to research, design, manufacture, performance validation, and application of high precision machines, instruments, and components, including fundamental and applied research and development in manufacturing processes, fabrication technology, and advanced measurement science. The scope includes precision-engineered systems and supporting metrology over the full range of length scales, from atom-based nanotechnology and advanced lithographic technology to large-scale systems, including optical and radio telescopes and macrometrology.