Model-based deep learning with fully connected neural networks for accelerated magnetic resonance parameter mapping.

IF 2.3 · JCR Region 3 (Medicine) · Q3 ENGINEERING, BIOMEDICAL
Naoto Fujita, Suguru Yokosawa, Toru Shirai, Yasuhiko Terada
DOI: 10.1007/s11548-025-03356-7 · Published 2025-05-03 · Journal Article · International Journal of Computer Assisted Radiology and Surgery
Citations: 0

Abstract


Purpose: Quantitative magnetic resonance imaging (qMRI) enables imaging of physical parameters related to the nuclear spin of protons in tissue, and is poised to revolutionize clinical research. However, improving the accuracy and clinical relevance of qMRI is essential for its practical implementation. This requires significantly reducing the currently lengthy acquisition times to enable clinical examinations and provide an environment where clinical accuracy and reliability can be verified. Deep learning (DL) has shown promise in significantly reducing imaging time and improving image quality in recent years. This study introduces a novel approach, quantitative deep cascade of convolutional network (qDC-CNN), as a framework for accelerated quantitative parameter mapping, offering a potential solution to this challenge. This work aims to verify that the proposed model outperforms the competing methods.

Methods: The proposed qDC-CNN is an integrated deep-learning framework combining an unrolled image reconstruction network with a fully connected neural network for parameter estimation. Training and testing used simulated multi-slice multi-echo (MSME) datasets generated from the BrainWeb database. The reconstruction error against ground truth was evaluated using the normalized root mean squared error (NRMSE) and compared with conventional DL-based methods. Two validation experiments were performed: (Experiment 1) assessment of acceleration factor (AF) dependency (AF = 5, 10, 20) with 16 echoes fixed, and (Experiment 2) evaluation of the impact of reducing the number of contrast images (16, 8, or 4 images).
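To make the evaluation metric concrete, the NRMSE comparison between an estimated parameter map and its ground truth can be sketched as follows. This is a minimal illustration only; the normalization convention used here (dividing by the range of the reference map) is one common choice and is an assumption, as the paper may normalize differently.

```python
import numpy as np

def nrmse(estimate, reference):
    """Normalized root mean squared error between an estimated
    parameter map and its ground-truth reference. Normalization
    by the reference range is one common convention (assumed here)."""
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    return rmse / (reference.max() - reference.min())

# Toy example: a synthetic T2 map (values in ms) with small additive noise.
rng = np.random.default_rng(0)
truth = rng.uniform(40.0, 120.0, size=(64, 64))
est = truth + rng.normal(0.0, 1.0, size=truth.shape)
print(nrmse(est, truth))  # small value, well under 0.1
```

A perfect reconstruction gives NRMSE = 0; the "within 10%" result in the paper corresponds to NRMSE values below 0.1 under its chosen normalization.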

Results: In most cases, the NRMSE values of S0 and T2 estimated from the proposed qDC-CNN were within 10%. In particular, the NRMSE values of T2 were much smaller than those of the conventional methods.
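The S0 and T2 values reported above parameterize the standard mono-exponential MSME signal model, S(TE) = S0 · exp(−TE / T2). A minimal log-linear least-squares fit illustrates how the two parameters are recovered from a 16-echo decay curve; the echo times and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Mono-exponential MSME signal model: S(TE) = S0 * exp(-TE / T2).
# 16 echoes with 10 ms spacing and the S0/T2 values are assumed for illustration.
te = np.arange(1, 17) * 10.0      # echo times in ms
s0_true, t2_true = 1000.0, 80.0   # T2 in ms
signal = s0_true * np.exp(-te / t2_true)

# Log-linearize: ln S = ln S0 - TE / T2, then fit a straight line.
slope, intercept = np.polyfit(te, np.log(signal), 1)
t2_fit = -1.0 / slope
s0_fit = np.exp(intercept)
```

On noiseless data this fit recovers S0 and T2 essentially exactly; the deep-learning pipeline replaces this per-voxel fit with a learned fully connected mapping from the multi-echo signal to the parameters.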

Conclusions: The proposed model had significantly smaller reconstruction errors than the conventional models. The proposed method can be applied to other qMRI sequences and has the flexibility to replace the image reconstruction module to improve performance.

Source journal
International Journal of Computer Assisted Radiology and Surgery
Categories: ENGINEERING, BIOMEDICAL · RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING
CiteScore: 5.90
Self-citation rate: 6.70%
Articles per year: 243
Review time: 6-12 weeks
Journal description: The International Journal for Computer Assisted Radiology and Surgery (IJCARS) is a peer-reviewed journal that provides a platform for closing the gap between medical and technical disciplines, and encourages interdisciplinary research and development activities in an international environment.