Mudassar Ali, Haoji Hu, Tong Wu, Maryam Mansoor, Qiong Luo, Weizeng Zheng, Neng Jin
Title: Segmentation of MRI tumors and pelvic anatomy via cGAN-synthesized data and attention-enhanced U-Net
DOI: 10.1016/j.patrec.2024.11.003
Journal: Pattern Recognition Letters, Volume 187, Pages 100-106
Publication date: 2024-11-24 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S016786552400309X
JCR: Q2, Computer Science, Artificial Intelligence; Impact Factor: 3.9
Citations: 0
Abstract
Accurate tumor segmentation in MRI images is of great importance for both diagnosis and treatment; however, in many cases, sufficient annotated datasets are not available. This paper develops a novel approach to the segmentation of tumors in the brain, liver, and pelvic regions of MRI images by combining an attention-enhanced U-Net model with a cGAN. We introduce three key novelties: a patch discriminator in the cGAN to enhance the realism of generated images, attention mechanisms in the U-Net to improve segmentation accuracy, and an application to pelvic MRI segmentation, which has seen little exploration. Our method addresses the limited availability of annotated data by generating realistic synthetic images to augment training. Experimental results on brain, liver, and pelvic MRI datasets show that our approach outperforms state-of-the-art methods, with Dice Coefficients of 98.61% for brain MRI, 88.60% for liver MRI, and 91.93% for pelvic MRI. We also observe notable improvements (reductions) in Hausdorff Distance, especially at complex anatomical regions such as tumor boundaries. The proposed combination of synthetic data generation and novel segmentation techniques opens new perspectives for robust medical image segmentation.
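The two evaluation metrics cited in the abstract can be computed straightforwardly for binary masks. The sketch below is illustrative only (it is not the authors' evaluation code): Dice measures volumetric overlap, while the symmetric Hausdorff Distance measures the worst-case boundary disagreement between two point sets, which is why it is sensitive at tumor boundaries.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2*|A intersect B| / (|A| + |B|) for binary masks.

    A value of 1.0 means perfect overlap; eps guards against
    division by zero when both masks are empty.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def hausdorff_distance(a_pts, b_pts):
    """Symmetric Hausdorff distance between two (N, 2) and (M, 2)
    point sets (e.g., boundary pixel coordinates).

    For each point in one set, find its nearest neighbor in the
    other; the Hausdorff distance is the largest such distance,
    taken in both directions.
    """
    # Pairwise Euclidean distances via broadcasting: shape (N, M).
    d = np.linalg.norm(a_pts[:, None, :] - b_pts[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

Lower Hausdorff Distance indicates better boundary agreement, so the improvements reported in the abstract correspond to reductions of this metric.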
Journal overview:
Pattern Recognition Letters aims at rapid publication of concise articles of a broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.