Uncertainty quantification for CT dosimetry based on 10 281 subjects using automatic image segmentation and fast Monte Carlo calculations.

Medical Physics. Pub Date: 2025-04-01. DOI: 10.1002/mp.17796
Zirui Ye, Bei Yao, Haoran Zheng, Li Tao, Ripeng Wang, Yankui Chang, Zhi Chen, Yingming Zhao, Wei Wei, Xie George Xu
{"title":"Uncertainty quantification for CT dosimetry based on 10 281 subjects using automatic image segmentation and fast Monte Carlo calculations.","authors":"Zirui Ye, Bei Yao, Haoran Zheng, Li Tao, Ripeng Wang, Yankui Chang, Zhi Chen, Yingming Zhao, Wei Wei, Xie George Xu","doi":"10.1002/mp.17796","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Computed tomography (CT) scans are a major source of medical radiation exposure worldwide. In countries like China, the frequency of CT scans has grown rapidly, thus making available a large volume of organ dose information. With modern computational methods, we are now able to overcome challenges in automatic organ segmentation and rapid Monte Carlo (MC) dose calculations. We hypothesize that it is possible to process an extremely large number of patient-specific organ dose datasets in order to quantify and understand the range of CT dose uncertainties associated with inter-individual variability.</p><p><strong>Purpose: </strong>In this paper, we present a novel method that combines automatic image segmentation with GPU-accelerated MC simulations to reconstruct patient-specific organ doses for a large cohort of 10 281 individuals (6419 males and 3862 females) who underwent CT examinations at a Chinese hospital. Through data mining and comparison, we analyze organ dose distribution patterns to investigate possible uncertainty in CT dosimetry methods that rely on simplified phantoms population-averaged patient models.</p><p><strong>Methods: </strong>Our data-processing workflow involved three key steps. First, we collected and anonymized CT images and subjects' health metrics (age, sex, height, and weight) from the hospital's database. Second, we utilized a deep learning-based segmentation tool, DeepContour, to automatically delineate organs from the CT images, and then performed GPU-accelerated MC organ dose calculations using a validated GE scanner model and the ARCEHR-CT software. Finally, we conducted a comprehensive statistical analysis of doses for eight organs: lungs, heart, breasts, esophagus, stomach, liver, pancreas, and spleen.</p><p><strong>Results: </strong>It took 16 days to process data for the entire cohorts-at a speed of 600 individual CT dose datasets per day-using a single NVIDIA RTX 3080 GPU card. The results show profound inter-individual variability in organ doses, even when only comparing subjects having similar body mass index (BMI) or water equivalent diameter (WED). Statistical analyses indicate that the data fitting-a method often used in analyzing the trend in CT dosimetry-can lead to relative errors exceeding as much as 50% for the data studied for this cohort. Statistical analyses also reveal quantitative correlations between organ doses and health metrics, including weight, BMI, WED, and size-specific dose estimate (SSDE), suggesting that these factors may still serve as surrogates for indirect dose estimation as long as the uncertainty is fully understood and tolerable. Interestingly, the CT scanner's tube current modulation reduces the average organ doses for the cohort as expected, but the individual organ dose variability remains similar to those from scans having a fixed tube current.</p><p><strong>Conclusions: </strong>Using newly available computational tools, this study has demonstrated the feasibility of conducting big data analysis towards CT dose data mining and uncertainty quantification. 
Results show that inter-individual variability is significant and can be taken into account in an effort to improve CT dosimetry.</p>","PeriodicalId":94136,"journal":{"name":"Medical physics","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2025-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Medical physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/mp.17796","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Background: Computed tomography (CT) scans are a major source of medical radiation exposure worldwide. In countries like China, the frequency of CT scans has grown rapidly, thus making available a large volume of organ dose information. With modern computational methods, we are now able to overcome challenges in automatic organ segmentation and rapid Monte Carlo (MC) dose calculations. We hypothesize that it is possible to process an extremely large number of patient-specific organ dose datasets in order to quantify and understand the range of CT dose uncertainties associated with inter-individual variability.

Purpose: In this paper, we present a novel method that combines automatic image segmentation with GPU-accelerated MC simulations to reconstruct patient-specific organ doses for a large cohort of 10 281 individuals (6419 males and 3862 females) who underwent CT examinations at a Chinese hospital. Through data mining and comparison, we analyze organ dose distribution patterns to investigate possible uncertainty in CT dosimetry methods that rely on simplified phantoms or population-averaged patient models.

Methods: Our data-processing workflow involved three key steps. First, we collected and anonymized CT images and subjects' health metrics (age, sex, height, and weight) from the hospital's database. Second, we utilized a deep learning-based segmentation tool, DeepContour, to automatically delineate organs from the CT images, and then performed GPU-accelerated MC organ dose calculations using a validated GE scanner model and the ARCHER-CT software. Finally, we conducted a comprehensive statistical analysis of doses for eight organs: lungs, heart, breasts, esophagus, stomach, liver, pancreas, and spleen.
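
The sketch below is a minimal illustration of how such a three-step cohort pipeline could be orchestrated; it is not the authors' implementation. The wrappers `run_deepcontour` and `run_archer_ct`, the `Subject` record, and all field names are assumptions standing in for the actual DeepContour and ARCHER-CT interfaces, which are not documented in the abstract.

```python
# Hypothetical orchestration of the workflow described above: segment each
# subject's CT, run the GPU Monte Carlo dose engine, and collect one record
# per subject for later statistical analysis. All interfaces are placeholders.

from dataclasses import dataclass
from pathlib import Path
from typing import Dict, List

# The eight organs analyzed in the paper.
ORGANS = ["lungs", "heart", "breasts", "esophagus",
          "stomach", "liver", "pancreas", "spleen"]

@dataclass
class Subject:
    subject_id: str     # anonymized identifier
    age: int
    sex: str            # "M" or "F"
    height_cm: float
    weight_kg: float
    ct_dir: Path        # directory holding the anonymized CT series

def run_deepcontour(ct_dir: Path) -> Path:
    """Hypothetical wrapper: auto-segment organs from a CT series and
    return the path of the resulting label volume."""
    raise NotImplementedError("replace with the actual DeepContour call")

def run_archer_ct(ct_dir: Path, labels: Path) -> Dict[str, float]:
    """Hypothetical wrapper: GPU-accelerated MC dose calculation with a
    validated scanner model; returns per-organ dose in mGy."""
    raise NotImplementedError("replace with the actual ARCHER-CT call")

def process_cohort(subjects: List[Subject]) -> List[Dict[str, float]]:
    """Glue for steps 2 and 3: segment, compute organ doses, and build
    one flat record per subject for statistical analysis."""
    records = []
    for s in subjects:
        labels = run_deepcontour(s.ct_dir)
        doses = run_archer_ct(s.ct_dir, labels)
        record = {"subject_id": s.subject_id,
                  "bmi": s.weight_kg / (s.height_cm / 100.0) ** 2}
        record.update({organ: doses.get(organ, float("nan"))
                       for organ in ORGANS})
        records.append(record)
    return records
```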

Results: It took 16 days to process data for the entire cohort, at a speed of 600 individual CT dose datasets per day, using a single NVIDIA RTX 3080 GPU card. The results show profound inter-individual variability in organ doses, even when only comparing subjects with similar body mass index (BMI) or water equivalent diameter (WED). Statistical analyses indicate that data fitting, a method often used to analyze trends in CT dosimetry, can lead to relative errors exceeding 50% for the data studied in this cohort. Statistical analyses also reveal quantitative correlations between organ doses and health metrics, including weight, BMI, WED, and size-specific dose estimate (SSDE), suggesting that these factors may still serve as surrogates for indirect dose estimation as long as the uncertainty is fully understood and tolerable. Interestingly, the CT scanner's tube current modulation reduces the average organ doses for the cohort as expected, but the individual organ dose variability remains similar to that observed in scans with a fixed tube current.
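
To make the 50% figure concrete, the hedged sketch below shows one way such a check can be set up: fit a simple trend of organ dose versus WED across a cohort, then measure how far each patient-specific MC dose deviates from the fitted prediction. This is not the paper's statistical pipeline; the synthetic data, the exponential trend form, and the chosen organ are assumptions for illustration only.

```python
# Illustrative only: quantify the relative error of a trend-based dose
# estimate against patient-specific Monte Carlo doses (synthetic data).

import numpy as np

rng = np.random.default_rng(0)

# Placeholder cohort: WED in cm and a per-patient organ dose in mGy,
# with lognormal scatter standing in for inter-individual variability.
wed = rng.uniform(20.0, 40.0, size=500)
true_dose = 30.0 * np.exp(-0.05 * wed)
dose = true_dose * rng.lognormal(mean=0.0, sigma=0.25, size=wed.size)

# Fit ln(dose) = ln(a) - b * WED by least squares (simple exponential trend).
slope, intercept = np.polyfit(wed, np.log(dose), deg=1)
fitted = np.exp(intercept) * np.exp(slope * wed)

# Relative error of the trend-based estimate for each subject.
rel_err = np.abs(fitted - dose) / dose
print(f"median relative error: {np.median(rel_err):.1%}")
print(f"95th percentile relative error: {np.percentile(rel_err, 95):.1%}")
print(f"fraction of subjects with error > 50%: {np.mean(rel_err > 0.5):.1%}")
```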

Conclusions: Using newly available computational tools, this study has demonstrated the feasibility of conducting big-data analysis for CT dose data mining and uncertainty quantification. Results show that inter-individual variability is significant and can be taken into account in efforts to improve CT dosimetry.
