FDuDoCLNet: Fully dual-domain contrastive learning network for parallel MRI reconstruction

IF 2.1 · JCR Q2 (Radiology, Nuclear Medicine & Medical Imaging) · CAS Tier 4 (Medicine)
Huiyao Zhang, Tiejun Yang, Heng Wang, Jiacheng Fan, Wenjie Zhang, Mingzhu Ji
DOI: 10.1016/j.mri.2025.110336
Journal: Magnetic Resonance Imaging, Volume 117, Article 110336
Published: 2025-01-24 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0730725X25000189
Citations: 0

Abstract

Magnetic resonance imaging (MRI) is a non-invasive medical imaging technique that is widely used for high-resolution imaging of soft tissues and organs. However, the slow acquisition speed of MRI, especially in high-resolution or dynamic scans, makes MRI reconstruction an important research topic. Currently, MRI reconstruction methods based on deep learning (DL) have garnered significant attention; they improve reconstruction quality by learning complex image features. However, DL-based MR image reconstruction methods exhibit certain limitations. First, existing reconstruction networks seldom account for the diverse frequency features in the wavelet domain. Second, existing dual-domain reconstruction methods may pay too much attention to the features of a single domain (such as the global information in the image domain or the local details in the wavelet domain), resulting in the loss of either critical global structures or fine details in certain regions of the reconstructed image. In this work, inspired by the lifting scheme in wavelet theory, we propose a novel Fully Dual-Domain Contrastive Learning Network (FDuDoCLNet) based on variational networks (VarNet) for accelerating parallel imaging (PI) in both the image and wavelet domains. It is composed of several cascaded dual-domain regularization units and data consistency (DC) layers, in which a novel dual-domain contrastive loss is introduced to optimize the reconstruction performance effectively. The proposed FDuDoCLNet was evaluated on the publicly available fastMRI multi-coil knee dataset under a 6× acceleration factor, achieving a PSNR of 34.439 dB and an SSIM of 0.895.
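The abstract names two generic building blocks that can be sketched concretely: the wavelet lifting scheme that motivates the dual-domain design, and the data consistency (DC) layer interleaved with the regularization units. The sketch below is illustrative only, not the authors' implementation: it assumes a Haar lifting step and a standard masked hard/soft DC formulation in k-space; the function names and the `lam` weighting are hypothetical.

```python
import numpy as np

def haar_lifting_forward(x):
    """One Haar lifting step along the last axis (length must be even).

    Split -> predict -> update: returns the coarse approximation s
    (running averages) and the detail coefficients d (differences).
    """
    even, odd = x[..., ::2], x[..., 1::2]
    d = odd - even        # predict: detail = odd minus prediction from even
    s = even + d / 2      # update: coarse = pairwise average
    return s, d

def haar_lifting_inverse(s, d):
    """Exact inverse of haar_lifting_forward (lifting is perfectly invertible)."""
    even = s - d / 2
    odd = d + even
    x = np.empty(s.shape[:-1] + (2 * s.shape[-1],), dtype=s.dtype)
    x[..., ::2], x[..., 1::2] = even, odd
    return x

def data_consistency(k_rec, k_meas, mask, lam=1.0):
    """Masked DC step in k-space.

    At sampled locations (mask == 1), blend the network's k-space estimate
    k_rec with the measured data k_meas; lam = 1 gives hard replacement.
    Unsampled locations keep the network's estimate.
    """
    return (1 - mask) * k_rec + mask * ((1 - lam) * k_rec + lam * k_meas)

# Minimal usage: lifting round-trips exactly; hard DC keeps measurements.
x = np.array([1.0, 2.0, 3.0, 4.0])
s, d = haar_lifting_forward(x)          # s = [1.5, 3.5], d = [1.0, 1.0]
assert np.allclose(haar_lifting_inverse(s, d), x)
k = data_consistency(np.zeros(4), np.array([5.0, 6.0, 7.0, 8.0]),
                     np.array([1.0, 0.0, 1.0, 0.0]))
# k = [5.0, 0.0, 7.0, 0.0]: sampled entries restored, others untouched
```

In a cascaded VarNet-style network, a DC step like this typically follows each learned regularization unit, so intermediate estimates never drift away from the acquired k-space samples.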
Source journal

Magnetic Resonance Imaging (Medicine – Nuclear Medicine)
CiteScore: 4.70
Self-citation rate: 4.00%
Articles per year: 194
Review time: 83 days
Journal description: Magnetic Resonance Imaging (MRI) is the first international multidisciplinary journal encompassing physical, life, and clinical science investigations as they relate to the development and use of magnetic resonance imaging. MRI is dedicated to basic research, technological innovation, and applications, providing a single forum for communication among radiologists, physicists, chemists, biochemists, biologists, engineers, internists, pathologists, physiologists, computer scientists, and mathematicians.