Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang
{"title":"通过动态重建和运动估计(DREME)联合框架(DREME),利用单个任意角度的 X 射线投影进行实时 CBCT 成像和运动跟踪","authors":"Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang","doi":"arxiv-2409.04614","DOIUrl":null,"url":null,"abstract":"Real-time cone-beam computed tomography (CBCT) provides instantaneous\nvisualization of patient anatomy for image guidance, motion tracking, and\nonline treatment adaptation in radiotherapy. While many real-time imaging and\nmotion tracking methods leveraged patient-specific prior information to\nalleviate under-sampling challenges and meet the temporal constraint (< 500\nms), the prior information can be outdated and introduce biases, thus\ncompromising the imaging and motion tracking accuracy. To address this\nchallenge, we developed a framework (DREME) for real-time CBCT imaging and\nmotion estimation, without relying on patient-specific prior knowledge. DREME\nincorporates a deep learning-based real-time CBCT imaging and motion estimation\nmethod into a dynamic CBCT reconstruction framework. The reconstruction\nframework reconstructs a dynamic sequence of CBCTs in a data-driven manner from\na standard pre-treatment scan, without utilizing patient-specific knowledge.\nMeanwhile, a convolutional neural network-based motion encoder is jointly\ntrained during the reconstruction to learn motion-related features relevant for\nreal-time motion estimation, based on a single arbitrarily-angled x-ray\nprojection. DREME was tested on digital phantom simulation and real patient\nstudies. DREME accurately solved 3D respiration-induced anatomic motion in real\ntime (~1.5 ms inference time for each x-ray projection). In the digital phantom\nstudy, it achieved an average lung tumor center-of-mass localization error of\n1.2$\\pm$0.9 mm (Mean$\\pm$SD). In the patient study, it achieved a real-time\ntumor localization accuracy of 1.8$\\pm$1.6 mm in the projection domain. DREME\nachieves CBCT and volumetric motion estimation in real time from a single x-ray\nprojection at arbitrary angles, paving the way for future clinical applications\nin intra-fractional motion management.","PeriodicalId":501378,"journal":{"name":"arXiv - PHYS - Medical Physics","volume":"8 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Real-time CBCT Imaging and Motion Tracking via a Single Arbitrarily-angled X-ray Projection by a Joint Dynamic Reconstruction and Motion Estimation (DREME) Framework (DREME) Framework\",\"authors\":\"Hua-Chieh Shao, Tielige Mengke, Tinsu Pan, You Zhang\",\"doi\":\"arxiv-2409.04614\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Real-time cone-beam computed tomography (CBCT) provides instantaneous\\nvisualization of patient anatomy for image guidance, motion tracking, and\\nonline treatment adaptation in radiotherapy. While many real-time imaging and\\nmotion tracking methods leveraged patient-specific prior information to\\nalleviate under-sampling challenges and meet the temporal constraint (< 500\\nms), the prior information can be outdated and introduce biases, thus\\ncompromising the imaging and motion tracking accuracy. To address this\\nchallenge, we developed a framework (DREME) for real-time CBCT imaging and\\nmotion estimation, without relying on patient-specific prior knowledge. DREME\\nincorporates a deep learning-based real-time CBCT imaging and motion estimation\\nmethod into a dynamic CBCT reconstruction framework. 
The reconstruction\\nframework reconstructs a dynamic sequence of CBCTs in a data-driven manner from\\na standard pre-treatment scan, without utilizing patient-specific knowledge.\\nMeanwhile, a convolutional neural network-based motion encoder is jointly\\ntrained during the reconstruction to learn motion-related features relevant for\\nreal-time motion estimation, based on a single arbitrarily-angled x-ray\\nprojection. DREME was tested on digital phantom simulation and real patient\\nstudies. DREME accurately solved 3D respiration-induced anatomic motion in real\\ntime (~1.5 ms inference time for each x-ray projection). In the digital phantom\\nstudy, it achieved an average lung tumor center-of-mass localization error of\\n1.2$\\\\pm$0.9 mm (Mean$\\\\pm$SD). In the patient study, it achieved a real-time\\ntumor localization accuracy of 1.8$\\\\pm$1.6 mm in the projection domain. DREME\\nachieves CBCT and volumetric motion estimation in real time from a single x-ray\\nprojection at arbitrary angles, paving the way for future clinical applications\\nin intra-fractional motion management.\",\"PeriodicalId\":501378,\"journal\":{\"name\":\"arXiv - PHYS - Medical Physics\",\"volume\":\"8 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Medical Physics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.04614\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Medical Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04614","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Real-time CBCT Imaging and Motion Tracking via a Single Arbitrarily-angled X-ray Projection by a Joint Dynamic Reconstruction and Motion Estimation (DREME) Framework

arXiv - PHYS - Medical Physics, 2024-09-06 (arXiv:2409.04614)
Real-time cone-beam computed tomography (CBCT) provides instantaneous
visualization of patient anatomy for image guidance, motion tracking, and
online treatment adaptation in radiotherapy. While many real-time imaging and
motion tracking methods leverage patient-specific prior information to
alleviate under-sampling challenges and meet the temporal constraint (< 500
ms), such prior information can be outdated and introduce biases,
compromising imaging and motion tracking accuracy. To address this
challenge, we developed a framework (DREME) for real-time CBCT imaging and
motion estimation, without relying on patient-specific prior knowledge. DREME
incorporates a deep learning-based real-time CBCT imaging and motion estimation
method into a dynamic CBCT reconstruction framework. The reconstruction
framework reconstructs a dynamic sequence of CBCTs in a data-driven manner from
a standard pre-treatment scan, without utilizing patient-specific knowledge.
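To make the joint setup concrete, below is a minimal conceptual sketch (PyTorch) of one way such a data-driven dynamic reconstruction can be parameterized: a reference volume warped by a low-rank motion model, jointly optimized so that projections simulated from the warped volume match the measured pre-treatment scan. The linear motion-basis parameterization, the toy parallel-ray projector, and all names here are illustrative assumptions, not DREME's actual implementation.

```python
# Conceptual sketch only: reference volume + low-rank motion model, fitted
# so simulated projections match measured ones. Sizes are toy-scale.
import torch
import torch.nn.functional as F

D = H = W = 32                       # toy volume size
K = 3                                # assumed number of motion basis components

ref_volume = torch.zeros(1, 1, D, H, W, requires_grad=True)    # reference CBCT
motion_basis = torch.zeros(K, D, H, W, 3, requires_grad=True)  # DVF basis


def warp(volume, coeffs):
    """Warp the reference volume by a DVF built from the motion basis."""
    dvf = torch.einsum("k,kdhwc->dhwc", coeffs, motion_basis)  # (D, H, W, 3)
    # Normalized sampling grid in grid_sample's [-1, 1] convention;
    # treating the DVF as already normalized is a simplification.
    zs, ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, D), torch.linspace(-1, 1, H),
        torch.linspace(-1, 1, W), indexing="ij")
    grid = torch.stack((xs, ys, zs), dim=-1).unsqueeze(0) + dvf.unsqueeze(0)
    return F.grid_sample(volume, grid, align_corners=True)


def project(volume):
    """Toy forward projector: parallel-ray sum along one axis
    (a real system would use an angle-dependent cone-beam projector)."""
    return volume.sum(dim=2)


# One optimization step against a (dummy) measured projection at time t.
measured = torch.rand(1, 1, H, W)
coeffs_t = torch.zeros(K, requires_grad=True)   # per-time motion coefficients
opt = torch.optim.Adam([ref_volume, motion_basis, coeffs_t], lr=1e-2)
loss = F.mse_loss(project(warp(ref_volume, coeffs_t)), measured)
loss.backward()
opt.step()
```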
Meanwhile, a convolutional neural network-based motion encoder is jointly
trained during the reconstruction to learn motion-related features for
real-time motion estimation from a single arbitrarily-angled x-ray
projection.
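As an illustration of this real-time inference path, the hedged sketch below shows a small CNN encoder mapping a single projection to motion-model coefficients, which would then drive the volumetric motion model (the `warp` function in the earlier sketch). The architecture is a placeholder, not the paper's network.

```python
# Placeholder encoder: one x-ray projection in, K motion coefficients out.
import torch
import torch.nn as nn

K = 3  # assumed motion-model rank, matching the earlier sketch

class MotionEncoder(nn.Module):
    def __init__(self, k=K):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, k)

    def forward(self, projection):             # (N, 1, H, W)
        f = self.features(projection).flatten(1)
        return self.head(f)                     # (N, K) motion coefficients

encoder = MotionEncoder().eval()
with torch.no_grad():
    proj = torch.rand(1, 1, 256, 256)           # one arbitrarily-angled projection
    coeffs = encoder(proj)                      # would feed warp(ref_volume, coeffs)
print(coeffs.shape)                             # torch.Size([1, 3])
```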
DREME was tested on digital phantom simulations and real patient
studies. DREME accurately resolved 3D respiration-induced anatomic motion in
real time (~1.5 ms inference time per x-ray projection). In the digital phantom
study, it achieved an average lung tumor center-of-mass localization error of
1.2$\pm$0.9 mm (Mean$\pm$SD). In the patient study, it achieved a real-time
tumor localization accuracy of 1.8$\pm$1.6 mm in the projection domain. DREME
achieves CBCT imaging and volumetric motion estimation in real time from a single x-ray
projection at arbitrary angles, paving the way for future clinical applications
in intra-fractional motion management.
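For completeness, here is a short sketch of how a center-of-mass localization metric of the kind reported above can be computed: the per-frame 3D Euclidean error between estimated and ground-truth tumor centers of mass, summarized as mean ± SD. The arrays are random stand-ins, not study data.

```python
# Toy metric computation: 3D COM localization error, mean +/- SD over frames.
import numpy as np

rng = np.random.default_rng(0)
gt_com = rng.normal(size=(100, 3))                        # ground-truth COM (mm)
est_com = gt_com + rng.normal(scale=0.8, size=(100, 3))   # estimated COM (mm)

errors = np.linalg.norm(est_com - gt_com, axis=1)         # per-frame 3D error
print(f"COM localization error: {errors.mean():.1f} +/- {errors.std():.1f} mm")
```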