{"title":"通道空间注意力引导的 CycleGAN 用于基于 CBCT 的合成 CT 生成,以实现自适应放疗","authors":"Yangchuan Liu;Shimin Liao;Yechen Zhu;Fuxing Deng;Zijian Zhang;Xin Gao;Tingting Cheng","doi":"10.1109/TCI.2024.3402372","DOIUrl":null,"url":null,"abstract":"Cone-beam computed tomography (CBCT) is the most commonly used 3D imaging modality in image-guided radiotherapy. However, severe artifacts and inaccurate Hounsfield units render CBCT images directly unusable for dose calculations in radiotherapy planning. The deformed pCT (dpCT) image produced by aligning the planning CT (pCT) image with the CBCT image can be viewed as the corrected CBCT image. However, when the interval between pCT and CBCT scans is long, the alignment error increases, which reduces the accuracy of dose calculations based on dpCT images. This study introduces a channel-spatial attention-guided cycle-consistent generative adversarial network (cycleGAN) called TranSE-cycleGAN, which learns mapping from CBCT to dpCT images and generates synthetic CT (sCT) images similar to dpCT images to achieve CBCT image correction. To enhance the network's ability to extract global features that reflect the overall noise and artifact distribution of the image, a TranSE branch, which is composed of a SELayer and an improved window-based transformer, was added parallel to the original residual convolution branch to the cycleGAN generator. To evaluate the proposed network, we collected data from 51 patients with head-and-neck cancer who underwent both pCT and CBCT scans. Among these, 45 were used for network training, and 6 were used for network testing. The results of the comparison experiments with cycleGAN and respath-cycleGAN demonstrate that the proposed TranSE-cycleGAN excels not only in image quality evaluation metrics, including mean absolute error, root mean square error, peak signal-to-noise ratio, and structural similarity but also in the Gamma index pass rate, a metric for dose accuracy evaluation. The superiority of the proposed method indicates its potential value in adaptive radiotherapy.","PeriodicalId":56022,"journal":{"name":"IEEE Transactions on Computational Imaging","volume":"10 ","pages":"818-831"},"PeriodicalIF":4.2000,"publicationDate":"2024-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Channel-Spatial Attention Guided CycleGAN for CBCT-Based Synthetic CT Generation to Enable Adaptive Radiotherapy\",\"authors\":\"Yangchuan Liu;Shimin Liao;Yechen Zhu;Fuxing Deng;Zijian Zhang;Xin Gao;Tingting Cheng\",\"doi\":\"10.1109/TCI.2024.3402372\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Cone-beam computed tomography (CBCT) is the most commonly used 3D imaging modality in image-guided radiotherapy. However, severe artifacts and inaccurate Hounsfield units render CBCT images directly unusable for dose calculations in radiotherapy planning. The deformed pCT (dpCT) image produced by aligning the planning CT (pCT) image with the CBCT image can be viewed as the corrected CBCT image. However, when the interval between pCT and CBCT scans is long, the alignment error increases, which reduces the accuracy of dose calculations based on dpCT images. This study introduces a channel-spatial attention-guided cycle-consistent generative adversarial network (cycleGAN) called TranSE-cycleGAN, which learns mapping from CBCT to dpCT images and generates synthetic CT (sCT) images similar to dpCT images to achieve CBCT image correction. 
To enhance the network's ability to extract global features that reflect the overall noise and artifact distribution of the image, a TranSE branch, which is composed of a SELayer and an improved window-based transformer, was added parallel to the original residual convolution branch to the cycleGAN generator. To evaluate the proposed network, we collected data from 51 patients with head-and-neck cancer who underwent both pCT and CBCT scans. Among these, 45 were used for network training, and 6 were used for network testing. The results of the comparison experiments with cycleGAN and respath-cycleGAN demonstrate that the proposed TranSE-cycleGAN excels not only in image quality evaluation metrics, including mean absolute error, root mean square error, peak signal-to-noise ratio, and structural similarity but also in the Gamma index pass rate, a metric for dose accuracy evaluation. The superiority of the proposed method indicates its potential value in adaptive radiotherapy.\",\"PeriodicalId\":56022,\"journal\":{\"name\":\"IEEE Transactions on Computational Imaging\",\"volume\":\"10 \",\"pages\":\"818-831\"},\"PeriodicalIF\":4.2000,\"publicationDate\":\"2024-03-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Computational Imaging\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10535271/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Computational Imaging","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10535271/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Channel-Spatial Attention Guided CycleGAN for CBCT-Based Synthetic CT Generation to Enable Adaptive Radiotherapy
Cone-beam computed tomography (CBCT) is the most commonly used 3D imaging modality in image-guided radiotherapy. However, severe artifacts and inaccurate Hounsfield units make CBCT images unusable for direct dose calculation in radiotherapy planning. The deformed pCT (dpCT) image produced by aligning the planning CT (pCT) image with the CBCT image can be viewed as a corrected CBCT image. However, when the interval between the pCT and CBCT scans is long, the alignment error increases, which reduces the accuracy of dose calculations based on dpCT images. This study introduces a channel-spatial attention-guided cycle-consistent generative adversarial network (cycleGAN), called TranSE-cycleGAN, which learns the mapping from CBCT to dpCT images and generates synthetic CT (sCT) images that resemble dpCT images, thereby correcting the CBCT images. To enhance the network's ability to extract global features reflecting the overall noise and artifact distribution of the image, a TranSE branch, composed of an SELayer and an improved window-based transformer, was added to the cycleGAN generator in parallel with the original residual convolution branch. To evaluate the proposed network, we collected data from 51 patients with head-and-neck cancer who underwent both pCT and CBCT scans; 45 were used for network training and 6 for network testing. Comparison experiments with cycleGAN and respath-cycleGAN demonstrate that the proposed TranSE-cycleGAN excels not only in image quality metrics, including mean absolute error, root mean square error, peak signal-to-noise ratio, and structural similarity, but also in the Gamma index pass rate, a metric for dose accuracy evaluation. The superiority of the proposed method indicates its potential value in adaptive radiotherapy.
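The abstract's key architectural change is a TranSE branch (an SELayer for channel attention followed by a window-based transformer for spatial attention) running in parallel with the residual convolution branch inside the cycleGAN generator. The paper contains the actual design; what follows is only a minimal PyTorch sketch of how such a parallel block could be wired, where the layer sizes, window size, reduction ratio, and additive fusion rule are assumptions for illustration, not the authors' implementation.

# Illustrative sketch only: a generator block with a convolutional residual branch
# and a parallel "TranSE"-style branch (squeeze-and-excitation channel attention
# followed by window-based self-attention). All hyperparameters are assumed.
import torch
import torch.nn as nn


class SELayer(nn.Module):
    """Squeeze-and-excitation channel attention (standard formulation)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)  # per-channel re-weighting


class WindowAttention(nn.Module):
    """Multi-head self-attention within non-overlapping spatial windows."""
    def __init__(self, channels: int, window: int = 8, heads: int = 4):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        s = self.window
        assert h % s == 0 and w % s == 0, "H and W must be divisible by the window size"
        # partition the feature map into (b * num_windows, s*s, c) token sequences
        tokens = (
            x.reshape(b, c, h // s, s, w // s, s)
            .permute(0, 2, 4, 3, 5, 1)
            .reshape(-1, s * s, c)
        )
        tokens = self.norm(tokens)
        out, _ = self.attn(tokens, tokens, tokens)
        # merge the windows back into a (b, c, h, w) feature map
        out = (
            out.reshape(b, h // s, w // s, s, s, c)
            .permute(0, 5, 1, 3, 2, 4)
            .reshape(b, c, h, w)
        )
        return x + out  # residual connection around the attention


class TranSEResBlock(nn.Module):
    """Residual conv branch plus a parallel SE/window-attention branch, summed."""
    def __init__(self, channels: int = 64):
        super().__init__()
        self.conv_branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels),
        )
        self.transe_branch = nn.Sequential(SELayer(channels), WindowAttention(channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.conv_branch(x) + self.transe_branch(x)


if __name__ == "__main__":
    block = TranSEResBlock(64)
    y = block(torch.randn(1, 64, 64, 64))  # a 64x64 feature map with 64 channels
    print(y.shape)                          # torch.Size([1, 64, 64, 64])

In this sketch the SE branch re-weights channels globally while the windowed attention mixes information spatially within each window, which is one plausible way to capture the image-wide noise and artifact distribution that the abstract says the TranSE branch is meant to model.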
Journal introduction:
The IEEE Transactions on Computational Imaging will publish articles where computation plays an integral role in the image formation process. Papers will cover all areas of computational imaging ranging from fundamental theoretical methods to the latest innovative computational imaging system designs. Topics of interest will include advanced algorithms and mathematical techniques, model-based data inversion, methods for image and signal recovery from sparse and incomplete data, techniques for non-traditional sensing of image data, methods for dynamic information acquisition and extraction from imaging sensors, software and hardware for efficient computation in imaging systems, and highly novel imaging system design.