Francesca Camagni, Anestis Nakas, Giovanni Parrella, Alessandro Vai, Silvia Molinelli, Viviana Vitolo, Amelia Barcellini, Agnieszka Chalaszczyk, Sara Imparato, Andrea Pella, Ester Orlandi, Guido Baroni, Marco Riboldi, Chiara Paganelli
Title: Generation of multimodal realistic computational phantoms as a test-bed for validating deep learning-based cross-modality synthesis techniques
Journal: Medical & Biological Engineering & Computing (JCR Q2, Computer Science, Interdisciplinary Applications)
DOI: 10.1007/s11517-025-03437-4
Published: 2025-09-27
Citations: 0
Abstract
The validation of multimodal deep learning models for medical image translation is limited by the lack of high-quality, paired datasets. We propose a novel framework that leverages computational phantoms to generate realistic CT and MRI images, enabling reliable ground-truth datasets for robust validation of artificial intelligence (AI) methods that generate synthetic CT (sCT) from MRI, specifically for radiotherapy applications. Two CycleGANs (cycle-consistent generative adversarial networks) were trained to transfer the imaging style of real patients onto CT and MRI phantoms, producing synthetic data with realistic textures and continuous intensity distributions. These data were evaluated through paired assessments with original phantoms, unpaired comparisons with patient scans, and dosimetric analysis using patient-specific radiotherapy treatment plans. Additional external validation was performed on public CT datasets to assess generalizability to unseen data. The resulting paired CT/MRI phantoms were used to validate a previously published GAN-based model for sCT generation from abdominal MRI in particle therapy. Results showed strong anatomical consistency with the original phantoms, high histogram correlation with patient images (HistCC = 0.998 ± 0.001 for MRI, HistCC = 0.97 ± 0.04 for CT), and dosimetric accuracy comparable to real data. The novelty of this work lies in using generated phantoms as validation data for deep learning-based cross-modality synthesis techniques.
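The histogram correlation coefficient (HistCC) reported above measures how closely the intensity distribution of a synthetic image matches that of real patient scans. The paper's exact binning and normalization are not given here; a minimal sketch, assuming HistCC is the Pearson correlation between normalized intensity histograms (bin count and range are illustrative choices):

```python
import numpy as np

def hist_cc(img_a, img_b, bins=64, value_range=(0.0, 1.0)):
    """Pearson correlation between the intensity histograms of two images.

    Assumption: HistCC is the correlation coefficient of density-normalized
    histograms; 64 bins over a fixed range is an illustrative choice, not
    the paper's configuration.
    """
    h_a, _ = np.histogram(img_a, bins=bins, range=value_range, density=True)
    h_b, _ = np.histogram(img_b, bins=bins, range=value_range, density=True)
    return float(np.corrcoef(h_a, h_b)[0, 1])

# Illustrative check: two noisy samples drawn from the same distribution
# should yield a HistCC close to 1, mimicking a well-matched synthetic image.
rng = np.random.default_rng(0)
a = rng.normal(0.5, 0.1, size=(128, 128)).clip(0.0, 1.0)
b = rng.normal(0.5, 0.1, size=(128, 128)).clip(0.0, 1.0)
print(hist_cc(a, b))
```

A value near 1 indicates closely matched intensity distributions even when the images are not spatially aligned, which is why a histogram-based metric suits the unpaired comparison against patient scans described above.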
Journal introduction:
Founded in 1963, Medical & Biological Engineering & Computing (MBEC) continues to serve the biomedical engineering community, covering the entire spectrum of biomedical and clinical engineering. The journal presents exciting and vital experimental and theoretical developments in biomedical science and technology, and reports on advances in computer-based methodologies in these multidisciplinary subjects. The journal also incorporates new and evolving technologies including cellular engineering and molecular imaging.
MBEC publishes original research articles as well as reviews and technical notes. Its Rapid Communications category focuses on material of immediate value to the readership, while the Controversies section provides a forum to exchange views on selected issues, stimulating a vigorous and informed debate in this exciting and high profile field.
MBEC is an official journal of the International Federation of Medical and Biological Engineering (IFMBE).