Fast four-dimensional cone-beam computed tomography reconstruction using deformable convolutional networks.

Medical Physics · Pub Date: 2022-10-01 · Epub Date: 2022-06-22 · DOI: 10.1002/mp.15806
Zhuoran Jiang, Yushi Chang, Zeyu Zhang, Fang-Fang Yin, Lei Ren
Medical Physics, vol. 49, no. 10, pp. 6461-6476. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9588592/pdf/nihms-1817259.pdf
Citations: 3

Abstract

Background: Although four-dimensional cone-beam computed tomography (4D-CBCT) provides valuable onboard image guidance for radiotherapy of moving targets, it requires a long acquisition time to achieve sufficient image quality for target localization. To improve its utility, it is highly desirable to reduce the 4D-CBCT scanning time while maintaining high image quality. Current motion-compensated methods are limited by slow speed and by compensation errors caused by severe intraphase undersampling.

Purpose: In this work, we propose an alternative feature-compensated method to achieve fast 4D-CBCT with high-quality images.

Methods: We proposed a feature-compensated deformable convolutional network (FeaCo-DCN) to perform interphase compensation in the latent feature space, which has not been explored by previous studies. In FeaCo-DCN, encoding networks extract features from each phase; the features of the other phases are then deformed to match those of the target phase via deformable convolutional networks. Finally, a decoding network combines and decodes the features from all phases to yield high-quality images of the target phase. The proposed FeaCo-DCN was evaluated using lung cancer patient data.
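To make the interphase compensation step concrete, the sketch below illustrates the core operation a deformable convolution performs: sampling a feature map at offset-shifted locations so that features from one respiratory phase are spatially aligned to the target phase. This is a minimal NumPy illustration, not the authors' implementation — in FeaCo-DCN the offsets are predicted by trained networks and applied inside deformable convolution layers, whereas here `offsets` is simply a given per-pixel displacement field.

```python
import numpy as np

def bilinear_sample(feat, ys, xs):
    """Bilinearly sample a 2-D feature map at fractional coordinates (ys, xs)."""
    h, w = feat.shape
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    dy = np.clip(ys - y0, 0.0, 1.0)  # fractional parts, clipped at borders
    dx = np.clip(xs - x0, 0.0, 1.0)
    return ((1 - dy) * (1 - dx) * feat[y0, x0]
            + (1 - dy) * dx * feat[y0, x0 + 1]
            + dy * (1 - dx) * feat[y0 + 1, x0]
            + dy * dx * feat[y0 + 1, x0 + 1])

def deform_features(feat, offsets):
    """Warp a feature map by per-pixel (dy, dx) offsets — the offset-guided
    sampling at the heart of a deformable convolution."""
    h, w = feat.shape
    ys, xs = np.meshgrid(np.arange(h, dtype=float),
                         np.arange(w, dtype=float), indexing="ij")
    return bilinear_sample(feat, ys + offsets[0], xs + offsets[1])

# A feature map from a non-target phase, and a uniform +1-pixel shift in x:
feat = np.arange(16, dtype=float).reshape(4, 4)
aligned = deform_features(feat, np.stack([np.zeros((4, 4)),
                                          np.ones((4, 4))]))
```

With zero offsets the warp is the identity; a nonzero offset field pulls each output location's feature from the displaced position, which is how features of other phases can be registered to the target phase before the decoder fuses them.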

Results: (1) FeaCo-DCN generated high-quality images with accurate and clear structures from a fast 4D-CBCT scan; (2) 4D-CBCT images reconstructed by FeaCo-DCN achieved 3D tumor localization accuracy within 2.5 mm; (3) image reconstruction was nearly real time; and (4) FeaCo-DCN achieved superior performance in all metrics compared to the top-ranked techniques in the AAPM SPARE Challenge.

Conclusion: The proposed FeaCo-DCN is effective and efficient in reconstructing 4D-CBCT while reducing the scanning time by about 90%, which can be highly valuable for moving-target localization in image-guided radiotherapy.
