Cross-Modality Image Translation of 3 Tesla Magnetic Resonance Imaging to 7 Tesla Using Generative Adversarial Networks

IF 3.3 · CAS Medicine, Region 2 · Q1 NEUROIMAGING
Eduardo Diniz, Tales Santini, Helmet Karim, Howard J. Aizenstein, Tamer S. Ibrahim
Human Brain Mapping, Vol. 46, No. 9 · DOI: 10.1002/hbm.70246 · Published 2025-06-22 · Full text: https://onlinelibrary.wiley.com/doi/10.1002/hbm.70246
Citations: 0

Abstract

The rapid advancements in magnetic resonance imaging (MRI) technology have precipitated a new paradigm wherein cross-modality data translation across diverse imaging platforms, field strengths, and different sites is increasingly challenging. This issue is particularly accentuated when transitioning from 3 Tesla (3T) to 7 Tesla (7T) MRI systems. This study proposes a novel solution to these challenges using generative adversarial networks (GANs)—specifically, the CycleGAN architecture—to create synthetic 7T images from 3T data. Employing a dataset of 1112 and 490 unpaired 3T and 7T MR images, respectively, we trained a 2-dimensional (2D) CycleGAN model, evaluating its performance on a paired dataset of 22 participants scanned at 3T and 7T. Independent testing on 22 distinct participants affirmed the model's proficiency in accurately predicting various tissue types, encompassing cerebral spinal fluid, gray matter, and white matter. Our approach provides a reliable and efficient methodology for synthesizing 7T images, achieving a median Dice coefficient of 83.62% for cerebral spinal fluid (CSF), 81.42% for gray matter (GM), and 89.75% for White Matter (WM), while the corresponding median Percentual Area Differences (PAD) were 6.82%, 7.63%, and 4.85% for CSF, GM, and WM, respectively, in the testing dataset, thereby aiding in harmonizing heterogeneous datasets. Furthermore, it delineates the potential of GANs in amplifying the contrast-to-noise ratio (CNR) from 3T, potentially enhancing the diagnostic capability of the images. While acknowledging the risk of model overfitting, our research underscores a promising progression toward harnessing the benefits of 7T MR systems in research investigations while preserving compatibility with existing 3T MR data. This work was previously presented at the ISMRM 2021 conference.
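The evaluation reported above rests on two segmentation-overlap metrics: the Dice coefficient and the percent area difference (PAD) between predicted and reference tissue masks. Below is a minimal sketch of how these metrics can be computed for binary masks; the paper does not spell out its exact PAD normalization, so the definition used here (absolute area difference normalized by the reference area) is an assumption for illustration only.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice overlap between two binary masks: 2*|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

def percent_area_difference(pred: np.ndarray, truth: np.ndarray) -> float:
    """Assumed PAD definition: |area(pred) - area(truth)| / area(truth) * 100."""
    a_pred = pred.astype(bool).sum()
    a_truth = truth.astype(bool).sum()
    return abs(float(a_pred) - float(a_truth)) / float(a_truth) * 100.0

# Toy example: two equal-sized 6x6 square masks, offset by one pixel
truth = np.zeros((10, 10), dtype=bool)
truth[2:8, 2:8] = True   # 36 pixels
pred = np.zeros((10, 10), dtype=bool)
pred[3:9, 3:9] = True    # 36 pixels, shifted; overlap is 5x5 = 25 pixels
print(f"Dice: {dice_coefficient(pred, truth):.4f}")         # 2*25/72 ≈ 0.6944
print(f"PAD:  {percent_area_difference(pred, truth):.2f}%")  # 0.00% (equal areas)
```

Note that Dice is sensitive to spatial misalignment while PAD only compares total tissue volume, which is why the paper reports both: a high Dice with low PAD indicates that the synthetic 7T segmentations match the reference in both location and extent.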

Source journal
Human Brain Mapping (Medicine - Nuclear Medicine & Medical Imaging)
CiteScore: 8.30
Self-citation rate: 6.20%
Annual article count: 401
Review time: 3-6 weeks
About the journal: Human Brain Mapping publishes peer-reviewed basic, clinical, technical, and theoretical research in the interdisciplinary and rapidly expanding field of human brain mapping. The journal features research derived from non-invasive brain imaging modalities used to explore the spatial and temporal organization of the neural systems supporting human behavior. Imaging modalities of interest include positron emission tomography, event-related potentials, electro- and magnetoencephalography, magnetic resonance imaging, and single-photon emission tomography. Brain mapping research in both normal and clinical populations is encouraged. Article formats include Research Articles, Review Articles, Clinical Case Studies, and Technique, as well as Technological Developments, Theoretical Articles, and Synthetic Reviews. Technical advances, such as novel brain imaging methods, analyses for detecting or localizing neural activity, synergistic uses of multiple imaging modalities, and strategies for the design of behavioral paradigms and neural-systems modeling are of particular interest. The journal endorses the propagation of methodological standards and encourages database development in the field of human brain mapping.