Plug-and-Play with 2.5D Artifact Reduction Prior for Fast and Accurate Industrial Computed Tomography Reconstruction

Impact Factor 2.4 | CAS Tier 3 (Materials Science) | JCR Q2, Materials Science, Characterization & Testing
Haley Duba-Sullivan, Aniket Pramanik, Venkatakrishnan Singanallur, Amirkoushyar Ziabari
{"title":"Plug-and-Play with 2.5D Artifact Reduction Prior for Fast and Accurate Industrial Computed Tomography Reconstruction","authors":"Haley Duba-Sullivan,&nbsp;Aniket Pramanik,&nbsp;Venkatakrishnan Singanallur,&nbsp;Amirkoushyar Ziabari","doi":"10.1007/s10921-025-01239-0","DOIUrl":null,"url":null,"abstract":"<div><p>Cone-beam X-ray computed tomography (XCT) is an essential imaging technique for generating 3D reconstructions of internal structures, with applications ranging from medical to industrial imaging. Producing high-quality reconstructions typically requires many X-ray measurements; this process can be slow and expensive, especially for dense materials. Recent work incorporating artifact reduction priors within a plug-and-play (PnP) reconstruction framework has shown promising results in improving image quality from sparse-view XCT scans while enhancing the generalizability of deep learning-based solutions. However, this method uses a 2D convolutional neural network (CNN) for artifact reduction, which captures only slice-independent information from the 3D reconstruction, limiting performance. In this paper, we propose a PnP reconstruction method that uses a 2.5D artifact reduction CNN as the prior. This approach leverages inter-slice information from adjacent slices, capturing richer spatial context while remaining computationally efficient. We show that this 2.5D prior not only improves the quality of reconstructions but also enables the model to directly suppress commonly occurring XCT artifacts (such as beam hardening), eliminating the need for artifact correction pre-processing. Experiments on both experimental and synthetic cone-beam XCT data demonstrate that the proposed method better preserves fine structural details, such as pore size and shape, leading to more accurate defect detection compared to 2D priors. In particular, we demonstrate strong performance on experimental XCT data using a 2.5D artifact reduction prior trained entirely on simulated scans, highlighting the proposed method’s ability to generalize across domains.</p></div>","PeriodicalId":655,"journal":{"name":"Journal of Nondestructive Evaluation","volume":"44 4","pages":""},"PeriodicalIF":2.4000,"publicationDate":"2025-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10921-025-01239-0.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Nondestructive Evaluation","FirstCategoryId":"88","ListUrlMain":"https://link.springer.com/article/10.1007/s10921-025-01239-0","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, CHARACTERIZATION & TESTING","Score":null,"Total":0}
Citations: 0

Abstract

Cone-beam X-ray computed tomography (XCT) is an essential imaging technique for generating 3D reconstructions of internal structures, with applications ranging from medical to industrial imaging. Producing high-quality reconstructions typically requires many X-ray measurements; this process can be slow and expensive, especially for dense materials. Recent work incorporating artifact reduction priors within a plug-and-play (PnP) reconstruction framework has shown promising results in improving image quality from sparse-view XCT scans while enhancing the generalizability of deep learning-based solutions. However, this method uses a 2D convolutional neural network (CNN) for artifact reduction, which processes each slice independently and therefore captures only intra-slice information from the 3D reconstruction, limiting performance. In this paper, we propose a PnP reconstruction method that uses a 2.5D artifact reduction CNN as the prior. This approach leverages inter-slice information from adjacent slices, capturing richer spatial context while remaining computationally efficient. We show that this 2.5D prior not only improves the quality of reconstructions but also enables the model to directly suppress commonly occurring XCT artifacts (such as beam hardening), eliminating the need for artifact correction pre-processing. Evaluations on both experimental and synthetic cone-beam XCT data demonstrate that the proposed method better preserves fine structural details, such as pore size and shape, leading to more accurate defect detection compared to 2D priors. In particular, we demonstrate strong performance on experimental XCT data using a 2.5D artifact reduction prior trained entirely on simulated scans, highlighting the proposed method's ability to generalize across domains.
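
The abstract gives only a high-level description of the method. The sketch below is an illustrative, hypothetical rendering (in PyTorch) of the two ideas it names: a 2.5D artifact-reduction CNN that takes a small stack of adjacent slices as its input channels, and a plug-and-play loop that alternates a data-consistency step with that learned prior. The network architecture, the `forward_op`/`adjoint_op` projector placeholders, and all hyperparameters are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class ArtifactReduction25D(nn.Module):
    """Toy 2.5D CNN: maps a stack of 2k+1 adjacent slices to a corrected center slice."""

    def __init__(self, num_adjacent: int = 1, width: int = 32):
        super().__init__()
        in_ch = 2 * num_adjacent + 1  # center slice plus its neighbors, stacked as channels
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, 1, 3, padding=1),
        )

    def forward(self, slab: torch.Tensor) -> torch.Tensor:
        # slab: (batch, 2k+1, H, W); predict a residual correction for the center slice
        c = slab.shape[1] // 2
        return slab[:, c : c + 1] + self.net(slab)


def apply_25d_prior(volume: torch.Tensor, model: nn.Module, k: int = 1) -> torch.Tensor:
    """Apply the 2.5D prior slice-by-slice over a (D, H, W) volume, replicating edge slices."""
    padded = torch.cat(
        [volume[:1].expand(k, -1, -1), volume, volume[-1:].expand(k, -1, -1)], dim=0
    )
    out = torch.empty_like(volume)
    for z in range(volume.shape[0]):
        slab = padded[z : z + 2 * k + 1].unsqueeze(0)  # (1, 2k+1, H, W)
        out[z] = model(slab)[0, 0]
    return out


def pnp_reconstruct(y, forward_op, adjoint_op, model, vol_shape,
                    num_iters=20, step=1e-2, k=1):
    """Plug-and-play iteration: gradient step on the data-fidelity term, then the learned prior.
    forward_op / adjoint_op stand in for a cone-beam projector and back-projector."""
    x = adjoint_op(y).reshape(vol_shape)
    for _ in range(num_iters):
        residual = forward_op(x) - y
        x = x - step * adjoint_op(residual).reshape(vol_shape)  # data-consistency step
        with torch.no_grad():
            x = apply_25d_prior(x, model, k=k)  # prior / artifact-reduction step
    return x


if __name__ == "__main__":
    # Toy run with an identity "projector" just to exercise the loop end to end.
    vol_shape = (8, 32, 32)
    x_true = torch.rand(vol_shape)
    forward_op = lambda v: v.reshape(-1)
    adjoint_op = lambda s: s.clone()
    y = forward_op(x_true)
    model = ArtifactReduction25D(num_adjacent=1)
    recon = pnp_reconstruct(y, forward_op, adjoint_op, model, vol_shape, num_iters=3)
    print(recon.shape)  # torch.Size([8, 32, 32])
```

In an actual pipeline the placeholders would be a calibrated cone-beam projector pair and a CNN trained on simulated artifact-corrupted reconstructions. The appeal of the 2.5D design, as described in the abstract, is that the convolutions stay 2D (cheap) while the channel dimension carries inter-slice context that a purely slice-by-slice 2D prior cannot see.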

Source journal: Journal of Nondestructive Evaluation (Engineering & Technology - Materials Science: Characterization & Testing)
CiteScore: 4.90
Self-citation rate: 7.10%
Articles per year: 67
Review time: 9 months
Journal description: Journal of Nondestructive Evaluation provides a forum for the broad range of scientific and engineering activities involved in developing a quantitative nondestructive evaluation (NDE) capability. This interdisciplinary journal publishes papers on the development of new equipment, analyses, and approaches to nondestructive measurements.