Unsupervised non-small cell lung cancer tumor segmentation using cycled generative adversarial network with similarity-based discriminator
Chengyijue Fang, Xiaoyang Li, Yidong Yang
Journal of Applied Clinical Medical Physics, vol. 26, no. 6, published 2025-04-23. DOI: 10.1002/acm2.70107
Abstract
Background
Tumor segmentation is crucial for lung disease diagnosis and treatment. Most existing deep learning-based automatic segmentation methods rely on manually annotated data for network training.
Purpose
This study aims to develop an unsupervised tumor segmentation network, smic-GAN, using a similarity-based discriminator within a generative adversarial network trained with a cycle strategy. The proposed method does not rely on any manual annotations and thus reduces the training data preparation workload.
Methods
A total of 609 CT scans of lung cancer patients are collected, of which 504 are used for training, 35 for validation, and 70 for testing. Smic-GAN is developed and trained to transform lung CT slices with tumors into synthetic images without tumors. Residual images are obtained by subtracting the synthetic images from the original CT slices. Thresholding, 3D median filtering, and morphological erosion and dilation operations are applied to generate binary tumor masks from the residual images. Dice similarity coefficient, positive predictive value (PPV), sensitivity (SEN), 95% Hausdorff distance (HD95), and average surface distance (ASD) are used to evaluate the accuracy of tumor contouring.
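The residual-to-mask post-processing described above can be sketched as follows. This is an illustrative implementation only: the threshold value, filter size, and structuring-element size are hypothetical placeholders, not parameters reported in the paper, and the synthetic tumor-free volume is assumed to have already been produced by the network.

```python
import numpy as np
from scipy import ndimage


def residual_to_mask(original, synthetic, threshold=100.0,
                     median_size=3, struct_size=2):
    """Convert a CT volume and its synthetic tumor-free counterpart
    into a binary tumor mask (threshold and sizes are illustrative)."""
    residual = original - synthetic            # tumor signal remains in the residual
    mask = residual > threshold                # thresholding
    mask = ndimage.median_filter(mask, size=median_size)  # 3D median filtering
    structure = np.ones((struct_size,) * 3, dtype=bool)
    mask = ndimage.binary_erosion(mask, structure=structure)   # suppress speckles
    mask = ndimage.binary_dilation(mask, structure=structure)  # restore tumor extent
    return mask.astype(np.uint8)


def dice(pred, gt):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    return 2.0 * inter / total if total > 0 else 1.0
```

In practice the threshold and morphological parameters would be tuned on the validation set, since they trade off false positives in the residual against under-segmentation of small tumors.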
Results
The smic-GAN method achieved a performance comparable to two supervised methods, UNet and Incre-MRRN, and outperformed the unsupervised cycle-GAN. The Dice value for smic-GAN is significantly better than that of cycle-GAN (74.5% ± 11.2% vs. 69.1% ± 16.0%, p < 0.05). The PPV for smic-GAN, UNet, and Incre-MRRN are 83.8% ± 21.5%, 75.1% ± 19.7%, and 78.2% ± 16.6%, respectively. The HD95 are 10.3 ± 7.7, 14.5 ± 14.6, and 6.2 ± 4.0 mm, respectively. The ASD are 3.7 ± 2.7, 4.8 ± 3.8, and 2.4 ± 1.8 mm, respectively.
Conclusion
The proposed smic-GAN performs comparably to the existing supervised methods UNet and Incre-MRRN. It does not rely on any manual annotations and can reduce the workload of training data preparation. It can also provide a good starting point for manual annotation when training supervised networks.
About the Journal:
Journal of Applied Clinical Medical Physics is an international Open Access publication dedicated to clinical medical physics. JACMP welcomes original contributions dealing with all aspects of medical physics from scientists working in clinical medical physics around the world. JACMP accepts online submissions only.
JACMP will publish:
-Original Contributions: Peer-reviewed investigations that represent new and significant contributions to the field. Recommended word count: up to 7500.
-Review Articles: Reviews of major areas or sub-areas in the field of clinical medical physics. These articles may be of any length and are peer reviewed.
-Technical Notes: These should be no longer than 3000 words, including key references.
-Letters to the Editor: Comments on papers published in JACMP or on any other matters of interest to clinical medical physics. These should be no more than 1250 words (including references), and their publication is at the discretion of the editor, who may occasionally consult experts on the merit of the contents.
-Book Reviews: The editorial office solicits Book Reviews.
-Announcements of Forthcoming Meetings: The Editor may provide notice of forthcoming meetings, course offerings, and other events relevant to clinical medical physics.
-Parallel Opposed Editorial: We welcome topics relevant to clinical practice and the medical physics profession. The contents can be a controversial debate or opposed aspects of an issue. One author argues for the position and the other against. Each side of the debate contains an opening statement of up to 800 words, followed by a rebuttal of up to 500 words. Readers interested in participating in this series should contact the moderator with a proposed title and a short description of the topic.