A dual-domain network with division residual connection and feature fusion for CBCT scatter correction.

Impact Factor 3.3 · CAS Region 3 (Medicine) · JCR Q2, Engineering, Biomedical
Shuo Yang, Zhe Wang, Linjie Chen, Ying Cheng, Huamin Wang, Xiao Bai, Guohua Cao
DOI: 10.1088/1361-6560/adaf06 · Physics in Medicine and Biology · Published 2025-02-07 · Citations: 0

Abstract

Objective. This study proposes a dual-domain network that not only reduces scatter artifacts but also retains structural details in cone-beam computed tomography (CBCT). Approach. The proposed network comprises a projection-domain sub-network and an image-domain sub-network. The projection-domain sub-network uses a division residual connection to amplify the difference between scatter signals and imaging signals, facilitating the learning of the scatter signals. The image-domain sub-network contains dual encoders and a single decoder. The two encoders extract features from two inputs in parallel, and the decoder fuses the features extracted by the two encoders and maps the fused features to the final high-quality image. Of the two input images to the image-domain sub-network, one is the scatter-contaminated image analytically reconstructed from the scatter-contaminated projections, and the other is the pre-processed image reconstructed from the pre-processed projections produced by the projection-domain sub-network. Main results. Experimental results on both synthetic and real data demonstrate that the method can effectively reduce scatter artifacts and restore image details. Quantitative analysis on synthetic data shows that the mean absolute error was reduced by 74% and the peak signal-to-noise ratio increased by 57% compared with the scatter-contaminated images. Testing on real data found a 38% increase in contrast-to-noise ratio with the proposed method compared with the scatter-contaminated image. Additionally, the method consistently outperforms comparison methods such as U-Net, DSE-Net, a deep residual convolutional neural network (DRCNN), and a collimator-based method. Significance. A dual-domain network that leverages a projection-domain division residual connection and image-domain feature fusion has been proposed for CBCT scatter correction. It has potential applications for reducing scatter artifacts and preserving image details in CBCT.
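The abstract describes a "division residual connection": instead of subtracting a learned scatter estimate, the projection-domain sub-network predicts a multiplicative ratio by which the contaminated projection is divided. The following is a minimal numpy sketch of that idea, together with the MAE and PSNR metrics used in the evaluation. It is an illustration only, not the paper's implementation: the function names are invented, and the learned ratio is replaced by an oracle value so the correction is exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy primary (scatter-free) projection and a constant additive scatter field
primary = rng.uniform(0.5, 1.0, size=(64, 64))
scatter = 0.3 * np.ones_like(primary)
contaminated = primary + scatter

def division_residual_correct(proj, predicted_ratio):
    """Division residual: corrected = proj / ratio, where the sub-network
    would be trained to predict ratio = (primary + scatter) / primary."""
    return proj / predicted_ratio

# In the paper a network learns this ratio; here we use the oracle value
oracle_ratio = contaminated / primary
corrected = division_residual_correct(contaminated, oracle_ratio)

def mae(a, b):
    """Mean absolute error between two images."""
    return np.mean(np.abs(a - b))

def psnr(a, ref):
    """Peak signal-to-noise ratio in dB, with the reference max as peak."""
    mse = np.mean((a - ref) ** 2)
    return 10 * np.log10(ref.max() ** 2 / mse)

print(f"MAE  before: {mae(contaminated, primary):.4f}  after: {mae(corrected, primary):.6f}")
print(f"PSNR before: {psnr(contaminated, primary):.2f} dB")
```

With an oracle ratio the division recovers the primary signal exactly; the point of the learned sub-network is to approximate that ratio from the contaminated projection alone.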

Source journal: Physics in Medicine and Biology (Medicine – Engineering: Biomedical)
CiteScore: 6.50
Self-citation rate: 14.30%
Articles per year: 409
Review time: 2 months
Journal scope: The development and application of theoretical, computational and experimental physics to medicine, physiology and biology. Topics covered are: therapy physics (including ionizing and non-ionizing radiation); biomedical imaging (e.g. x-ray, magnetic resonance, ultrasound, optical and nuclear imaging); image-guided interventions; image reconstruction and analysis (including kinetic modelling); artificial intelligence in biomedical physics and analysis; nanoparticles in imaging and therapy; radiobiology; radiation protection and patient dose monitoring; radiation dosimetry.