PUPC-GANs: A Novel Image Conversion Model using Modified CycleGANs in Healthcare

Shweta Taneja, Bhawna Suri, Aman Roy, Ashish Chowdhry, H. Kumar, Kautuk Dwivedi

Recent Advances in Computer Science and Communications, published 2023-03-30. DOI: 10.2174/2666255816666230330100005
Abstract
Magnetic resonance imaging (MRI) and computed tomography (CT) each have their own areas of specialty in medical imaging. MRI is considered the safer modality, as it exploits the magnetic properties of the hydrogen nucleus. A CT scan, in contrast, uses multiple X-rays; this exposure to ionizing radiation is known to contribute to carcinogenesis and can affect the patient's health.
In scenarios such as radiation therapy, where both MRI and CT are required for treatment, a practical approach to obtaining both scans is to acquire the MRI and generate the CT scan from it. Current deep learning methods for MRI-to-CT synthesis use either paired data or unpaired data exclusively. Models trained on paired data suffer from the scarcity of well-aligned data.
Training with unpaired data can generate visually realistic images, but it still does not guarantee good accuracy. To overcome this, we propose a new model, PUPC-GANs (Paired Unpaired CycleGANs), based on CycleGANs (Cycle-Consistent Adversarial Networks).
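For context, a CycleGAN learns two mappings, G: MRI → CT and F: CT → MRI, and enforces cycle consistency, i.e. F(G(x)) ≈ x, so that unpaired images can supervise each other. The sketch below shows that standard cycle-consistency loss in PyTorch; the generators are placeholders, since the paper's exact architectures are not described here.

```python
import torch
import torch.nn as nn

# Placeholder generators; stand-ins for the actual networks.
G = nn.Sequential(nn.Conv2d(1, 1, 3, padding=1))  # MRI -> CT
F = nn.Sequential(nn.Conv2d(1, 1, 3, padding=1))  # CT -> MRI

l1 = nn.L1Loss()

def cycle_consistency_loss(real_mri, real_ct, lam=10.0):
    """Standard CycleGAN cycle loss: translating an image to the
    other domain and back should reproduce the original image."""
    fake_ct = G(real_mri)
    fake_mri = F(real_ct)
    loss_mri_cycle = l1(F(fake_ct), real_mri)  # MRI -> CT -> MRI
    loss_ct_cycle = l1(G(fake_mri), real_ct)   # CT -> MRI -> CT
    return lam * (loss_mri_cycle + loss_ct_cycle)
```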
This model learns transformations from both paired and unpaired data. To support this, a paired loss is introduced. On MAE, MSE, NRMSE, PSNR, and SSIM metrics, PUPC-GANs outperforms CycleGANs.
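A plausible reading of this paired loss is a direct supervised term applied only to aligned MRI/CT pairs, added on top of the usual adversarial and cycle losses. Continuing the sketch above, the form shown here (an L1 penalty between the synthesized and ground-truth CT, and the `mu` weight) is an assumption for illustration, not the paper's exact formulation.

```python
def paired_loss(mri_batch, ct_batch, mu=5.0):
    """Hypothetical supervised term for aligned pairs: the CT
    generated from an MRI slice should match its registered
    ground-truth CT directly, not merely fool the discriminator."""
    return mu * l1(G(mri_batch), ct_batch)

def total_generator_loss(real_mri, real_ct, paired):
    loss = cycle_consistency_loss(real_mri, real_ct)
    if paired:  # apply the paired term only when the batch is aligned
        loss = loss + paired_loss(real_mri, real_ct)
    return loss  # adversarial terms omitted for brevity
```

The appeal of such a design is that the same generators train on abundant unpaired scans while any well-aligned pairs contribute an extra, stronger supervision signal.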
Although MRI and CT have different areas of application, there are use cases, such as radiation therapy, where both are required. A feasible approach to obtaining both images is to synthesize the CT from the MRI scan. Current methods fail to exploit paired data alongside the abundantly available unpaired data. The proposed model, PUPC-GANs, utilizes whatever paired data is present during the training phase. This ability, combined with the conventional CycleGANs model, produces a significant improvement in results compared to training with unpaired data alone. When the two models are compared using loss metrics, including MAE, MSE, NRMSE, and PSNR, the proposed model outperforms CycleGANs. It achieves an SSIM of 0.8, superior to that obtained by CycleGANs, and produces comparable results on visual examination.
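For reference, the reported metrics can be computed slice-wise with NumPy and scikit-image (the three skimage.metrics functions used below do exist under these names); a minimal sketch:

```python
import numpy as np
from skimage.metrics import (
    normalized_root_mse,
    peak_signal_noise_ratio,
    structural_similarity,
)

def evaluate(real_ct, synth_ct):
    """Compute MAE, MSE, NRMSE, PSNR, and SSIM for one pair of
    real and synthesized CT slices given as float arrays."""
    mae = np.mean(np.abs(real_ct - synth_ct))
    mse = np.mean((real_ct - synth_ct) ** 2)
    nrmse = normalized_root_mse(real_ct, synth_ct)
    data_range = real_ct.max() - real_ct.min()
    psnr = peak_signal_noise_ratio(real_ct, synth_ct, data_range=data_range)
    ssim = structural_similarity(real_ct, synth_ct, data_range=data_range)
    return dict(mae=mae, mse=mse, nrmse=nrmse, psnr=psnr, ssim=ssim)
```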