Tomographic Sparse View Selection Using the View Covariance Loss.

Jingsong Lin, Amirkoushyar Ziabari, Singanallur V. Venkatakrishnan, Obaidullah Rahman, Gregery T. Buzzard, Charles A. Bouman

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence (Impact Factor: 18.6)
DOI: 10.1109/TPAMI.2025.3600072
Published: 2025-08-19
Citations: 0

Abstract

Standard computed tomography (CT) reconstruction algorithms such as filtered back projection (FBP) and Feldkamp-Davis-Kress (FDK) require many views for producing high-quality reconstructions, which can slow image acquisition and increase cost in non-destructive evaluation (NDE) applications. Over the past 20 years, a variety of methods have been developed for computing high-quality CT reconstructions from sparse views. However, the problem of how to select the best views for CT reconstruction remains open. In this paper, we present a novel view covariance loss (VCL) function that measures the joint information of a set of views by approximating the normalized mean squared error (NMSE) of the reconstruction. We present fast algorithms for computing the VCL along with an algorithm for selecting a subset of views that approximately minimizes its value. Our experiments on simulated and measured data indicate that for a fixed number of views our proposed view covariance loss selection (VCLS) algorithm results in reconstructions with lower NRMSE, fewer artifacts, and greater accuracy than current alternative approaches.
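
The abstract outlines a two-step pipeline: compute a view covariance loss (VCL) that approximates the reconstruction NMSE for a candidate set of views, then select a subset of views that approximately minimizes that loss. As a rough illustration of the selection step only, the sketch below runs a greedy loop over a precomputed pairwise view-covariance matrix. The surrogate loss, the toy covariance model, and all names (`subset_loss`, `greedy_view_selection`) are assumptions for illustration; they are not the paper's actual VCL definition or VCLS algorithm, which are not specified in the abstract.

```python
# Illustrative sketch only: greedy view selection driven by a covariance-based
# surrogate loss. The loss below (mean pairwise covariance among the selected
# views) is an assumption for illustration, not the paper's VCL.
import numpy as np


def subset_loss(cov: np.ndarray, subset: list[int]) -> float:
    """Hypothetical surrogate loss: average pairwise covariance of the subset.
    Lower values mean the selected views are less redundant with each other."""
    if len(subset) < 2:
        return 0.0
    sub = cov[np.ix_(subset, subset)]
    off_diag = sub.sum() - np.trace(sub)
    return off_diag / (len(subset) * (len(subset) - 1))


def greedy_view_selection(cov: np.ndarray, num_views: int) -> list[int]:
    """Greedily pick `num_views` view indices that keep the surrogate loss low."""
    selected: list[int] = []
    remaining = set(range(cov.shape[0]))
    for _ in range(num_views):
        best_view, best_loss = None, np.inf
        for v in remaining:
            loss = subset_loss(cov, selected + [v])
            if loss < best_loss:
                best_view, best_loss = v, loss
        selected.append(best_view)
        remaining.remove(best_view)
    return selected


# Example: 180 candidate angles; pairwise "covariance" from angular proximity
# (views at nearly identical or opposite angles are treated as redundant).
angles = np.linspace(0, np.pi, 180, endpoint=False)
cov = np.cos(angles[:, None] - angles[None, :]) ** 2  # toy redundancy measure
picked = greedy_view_selection(cov, num_views=12)
print(sorted(picked))
```

A greedy loop like this evaluates the loss from scratch for every candidate at every step; the abstract mentions fast algorithms for computing the VCL, which this sketch does not attempt to reproduce.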
