Cross-modality image-to-image translation from MR to synthetic 18F-FDOPA PET/MR fusion images using conditional GAN in brain cancer
Youngbeom Seo, Heesung Yang, Eunjung Kong, Vivek Sanker, Atman Desai, Jungwon Lee, So Hee Park, You Seon Song, Ikchan Jeon
Neuroradiology, published 2025-07-19. DOI: 10.1007/s00234-025-03704-z
Abstract
Objective: This study aims to evaluate the feasibility of cross-modality image-to-image translation from magnetic resonance (MR) images to synthetic positron emission tomography (PET)/MR fusion images using a conditional generative adversarial network (CGAN).
Methods: A retrospective study was conducted on 32 simultaneous 6-[18F]-fluoro-L-3,4-dihydroxyphenylalanine (18F-FDOPA) PET/MR imaging examinations from 27 patients diagnosed with brain cancer. Paired axial T1-weighted contrast-enhanced MR (T1C) and PET/T1C fusion images were used to translate T1C images into synthetic PET/T1C fusion images with the Pix2Pix CGAN algorithm. To assess the similarity between real and synthetic PET/T1C fusion images, we calculated correlation coefficients for the maximum and mean tumor-to-background ratios (TBRmax/mean) and performed quantitative analyses using the peak signal-to-noise ratio (PSNR), mean squared error (MSE), structural similarity index (SSIM), and feature similarity index measure (FSIM).
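As a rough illustration of the evaluation step (not the authors' code), the sketch below computes PSNR, MSE, and SSIM for a real/synthetic image pair with scikit-image and a Pearson correlation for paired TBR values with SciPy; FSIM is omitted because it is not provided by scikit-image. The image arrays, TBR values, and normalization to [0, 1] are hypothetical stand-ins.

```python
# Minimal sketch of the image-similarity evaluation, assuming images are
# already co-registered, converted to grayscale, and scaled to [0, 1].
import numpy as np
from scipy.stats import pearsonr
from skimage.metrics import (
    mean_squared_error,
    peak_signal_noise_ratio,
    structural_similarity,
)

def compare_fusion_images(real: np.ndarray, synthetic: np.ndarray) -> dict:
    """Return PSNR, MSE, and SSIM for one real/synthetic image pair."""
    return {
        "psnr": peak_signal_noise_ratio(real, synthetic, data_range=1.0),
        "mse": mean_squared_error(real, synthetic),
        "ssim": structural_similarity(real, synthetic, data_range=1.0),
    }

# Hypothetical paired TBR values measured on real vs. synthetic images.
tbr_real = np.array([1.8, 2.4, 3.1, 2.0, 2.7])
tbr_synthetic = np.array([1.7, 2.5, 2.9, 2.2, 2.6])
r, p_value = pearsonr(tbr_real, tbr_synthetic)
print(f"Pearson r = {r:.3f}, p = {p_value:.3f}")

# Random stand-in images (PET/T1C slices in practice).
rng = np.random.default_rng(0)
real_img = rng.random((256, 256))
synthetic_img = np.clip(real_img + 0.05 * rng.standard_normal((256, 256)), 0, 1)
print(compare_fusion_images(real_img, synthetic_img))
```

In practice these per-slice metrics would be averaged over the test set to obtain the summary values reported in the Results.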
Results: A total of 2167 pairs of T1C and PET/T1C fusion images were obtained and randomly assigned to training and test datasets in a 9:1 ratio (1950 and 217 pairs); the training data were further divided into training and validation datasets in a 4:1 ratio (1560 and 390 pairs). The correlation coefficients were 0.706 (CI: 0.533-0.822) for TBRmax (p < 0.001) and 0.901 (CI: 0.831-0.943) for TBRmean (p < 0.001). The quantitative analyses yielded a PSNR of 31.075 ± 3.976, an MSE of 0.001 ± 0.001, an SSIM of 0.868 ± 0.079, and an FSIM of 0.922 ± 0.044.
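The reported dataset sizes follow directly from the splits (2167 × 0.9 ≈ 1950 with 217 held out for testing, then 1950 × 0.8 = 1560 with 390 for validation). A minimal sketch of such a two-stage random split, assuming placeholder pair indices rather than the authors' actual pipeline or file lists:

```python
# Sketch of the 9:1 and 4:1 random splits described above
# (illustrative only; real items would be image-file paths, not indices).
import random

pairs = list(range(2167))  # stand-ins for the 2167 T1C / PET-T1C pairs
random.seed(42)
random.shuffle(pairs)

n_train_full = round(len(pairs) * 0.9)                         # 1950
train_full, test = pairs[:n_train_full], pairs[n_train_full:]  # 1950 / 217

n_train = round(len(train_full) * 0.8)                         # 1560
train, val = train_full[:n_train], train_full[n_train:]        # 1560 / 390

print(len(train), len(val), len(test))  # 1560 390 217
```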
Conclusion: A CGAN trained on simultaneous 18F-FDOPA PET/MR imaging data demonstrated the potential for cross-modality image-to-image translation from T1C to PET/T1C fusion images, though limitations, including the small dataset and the lack of external validation, call for further research.
Journal introduction:
Neuroradiology aims to provide state-of-the-art medical and scientific information in the fields of Neuroradiology, Neurosciences, Neurology, Psychiatry, Neurosurgery, and related medical specialities. As the official journal of the European Society of Neuroradiology, Neuroradiology receives submissions from all parts of the world and publishes peer-reviewed original research, comprehensive reviews, educational papers, opinion papers, and short reports on exceptional clinical observations and new technical developments in the field of Neuroimaging and Neurointervention. The journal has subsections for Diagnostic and Interventional Neuroradiology, Advanced Neuroimaging, Paediatric Neuroradiology, Head-Neck-ENT Radiology, Spine Neuroradiology, and for submissions from Japan. Neuroradiology aims to provide new knowledge about and insights into the function and pathology of the human nervous system that may help to better diagnose and treat nervous system diseases. Neuroradiology is a member of the Committee on Publication Ethics (COPE) and follows the COPE core practices. Neuroradiology prefers articles that are free of bias, self-critical regarding limitations, transparent and clear in describing study participants, methods, and statistics, and short in presenting results. Before peer review, all submissions are automatically checked by iThenticate to assess potential overlap with prior publications.