Automated quantification of brain PET in PET/CT using deep learning-based CT-to-MR translation: a feasibility study

IF 8.6 · Tier 1 (Medicine) · Q1 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING
Daesung Kim, Kyobin Choo, Sangwon Lee, Seongjin Kang, Mijin Yun, Jaewon Yang
DOI: 10.1007/s00259-025-07132-2
Journal: European Journal of Nuclear Medicine and Molecular Imaging
Published: 2025-02-18 (Journal Article)
Citations: 0

Abstract

Purpose

Quantitative analysis of PET images in brain PET/CT relies on MRI-derived regions of interest (ROIs). However, paired PET/CT and MR images are not always available, and aligning them is challenging when their acquisition times differ considerably. To address these problems, this study proposes a deep learning framework that translates the CT of PET/CT into synthetic MR images (MRSYN) and performs automated quantitative regional analysis using MRSYN-derived segmentation.

Methods

In this retrospective study, 139 subjects who underwent brain [18F]FBB PET/CT and T1-weighted MRI were included. A U-Net-like model was trained to translate CT images to MRSYN; subsequently, a separate model was trained to segment MRSYN into 95 regions. Regional and composite standardised uptake value ratios (SUVrs) were calculated in [18F]FBB PET images using the acquired ROIs. MRSYN was evaluated with quantitative measures including the structural similarity index measure (SSIM), while MRSYN-based segmentation was evaluated with the Dice similarity coefficient (DSC). A Wilcoxon signed-rank test was performed on SUVrs computed using MRSYN and ground-truth MR (MRGT).
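The regional SUVr described above is the mean tracer uptake in each ROI divided by the mean uptake in a reference region. A minimal sketch of that calculation, assuming the PET volume and ROI masks are already co-registered NumPy arrays (the function name and toy volumes are illustrative, not from the paper):

```python
import numpy as np

def regional_suvr(pet, region_masks, reference_mask):
    """SUV ratio per ROI: mean uptake inside each region mask
    divided by mean uptake inside the reference mask.
    All arrays must share the same voxel grid."""
    ref_mean = pet[reference_mask].mean()
    return {name: pet[mask].mean() / ref_mean
            for name, mask in region_masks.items()}

# Toy example on a synthetic 4x4x4 volume (illustrative values only)
pet = np.ones((4, 4, 4))
pet[0] = 2.0                          # "target" slab with doubled uptake
masks = {"target": np.zeros_like(pet, dtype=bool)}
masks["target"][0] = True
ref = np.zeros_like(pet, dtype=bool)
ref[3] = True                         # reference slab with uptake 1.0
print(regional_suvr(pet, masks, ref))  # {'target': 2.0}
```

In practice the 95 region masks would come from the MRSYN-based segmentation resampled onto the PET grid.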

Results

Compared to MRGT, the mean SSIM of MRSYN was 0.974 ± 0.005. The MRSYN-based segmentation achieved a mean DSC of 0.733 across 95 regions. No statistically significant difference (P > 0.05) in SUVr was found between ROIs derived from MRSYN and those from MRGT, except for the precuneus.
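The two evaluation steps reported here, overlap scoring with DSC and paired SUVr comparison with a Wilcoxon signed-rank test, can be sketched as follows (the SUVr values below are synthetic, not the paper's data):

```python
import numpy as np
from scipy import stats

def dice(a, b):
    """Dice similarity coefficient between two boolean masks:
    2 * |A intersect B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy masks: 2 overlapping voxels out of 3 + 3 -> DSC = 4/6
a = np.array([1, 1, 1, 0, 0], dtype=bool)
b = np.array([0, 1, 1, 1, 0], dtype=bool)
print(round(dice(a, b), 3))  # 0.667

# Paired per-subject SUVr from two segmentations (synthetic values,
# standing in for the MRSYN-based vs MRGT-based ROIs)
suvr_syn = np.array([1.21, 1.35, 1.10, 1.42, 1.05])
suvr_gt = np.array([1.20, 1.37, 1.08, 1.44, 1.06])
stat, p = stats.wilcoxon(suvr_syn, suvr_gt)
print(p > 0.05)  # no significant difference for these toy values
```

A nonparametric paired test is a natural choice here because the per-subject SUVr differences are small and not guaranteed to be normally distributed.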

Conclusion

We demonstrated a deep learning framework for automated regional brain analysis in PET/CT with MRSYN. Our proposed framework can benefit patients who have difficulty undergoing an MRI scan.

Source journal: European Journal of Nuclear Medicine and Molecular Imaging
CiteScore: 15.60
Self-citation rate: 9.90%
Annual publications: 392
Review time: 3 months
Journal description: The European Journal of Nuclear Medicine and Molecular Imaging serves as a platform for the exchange of clinical and scientific information within nuclear medicine and related professions. It welcomes international submissions from professionals involved in the functional, metabolic, and molecular investigation of diseases. The journal's coverage spans physics, dosimetry, radiation biology, radiochemistry, and pharmacy, providing high-quality peer review by experts in the field. Known for highly cited and downloaded articles, it ensures global visibility for research work and is part of the EJNMMI journal family.