An Approach to Segment Nuclei and Cytoplasm in Lung Cancer Brightfield Images Using Hybrid Swin-Unet Transformer

Impact Factor: 1.6 · CAS Zone 4 (Medicine) · JCR Q4 (Engineering, Biomedical)
Sreelekshmi Palliyil Sreekumar, Rohini Palanisamy, Ramakrishnan Swaminathan
Journal: Journal of Medical and Biological Engineering
DOI: 10.1007/s40846-024-00873-9 (https://doi.org/10.1007/s40846-024-00873-9)
Published: 2024-05-29 · Journal Article
Citations: 0

Abstract

Purpose

Segmentation of nuclei and cytoplasm in cellular images is essential for estimating the prognosis of lung cancer. Detecting these structures in unstained brightfield microscopic images is challenging due to poor contrast and the lack of separation between structures with irregular morphology. This work aims to perform semantic segmentation of nuclei and cytoplasm in lung cancer brightfield images using the Swin-Unet Transformer.

Methods

For this study, publicly available brightfield images of lung cancer cells are pre-processed and fed to the Swin-Unet for semantic segmentation. Model-specific hyperparameters are identified through detailed analysis, and the segmentation performance is validated using standard evaluation metrics.
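One of the hyperparameters analysed is the loss function, and the focal loss ultimately selected (see Results) downweights easy, well-classified pixels so training concentrates on hard, low-contrast regions. A minimal NumPy sketch of the standard binary focal loss (Lin et al., 2017); the γ and α values here are the common defaults, not necessarily those used in the paper:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: mean of -alpha_t * (1 - p_t)^gamma * log(p_t)."""
    p = np.clip(p, 1e-7, 1 - 1e-7)          # avoid log(0)
    p_t = np.where(y == 1, p, 1 - p)        # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return float(np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t)))

# A confidently correct pixel contributes far less than an uncertain one,
# so gradient signal concentrates on hard pixels.
y = np.array([1.0])
confident = focal_loss(np.array([0.9]), y)
uncertain = focal_loss(np.array([0.6]), y)
```

The `(1 - p_t)^gamma` modulating factor is what distinguishes focal loss from plain cross-entropy: as the predicted probability of the true class approaches 1, the pixel's contribution vanishes quadratically (for γ = 2).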

Results

The hyperparameter analysis identifies the optimal configuration as focal loss, a learning rate of 0.0001, the Adam optimizer, and a Swin Transformer patch size of 4. With these parameters, the Swin-Unet Transformer accurately segments the nuclei and cytoplasm in the brightfield images, with pixel-F1 scores of 90.71% and 79.29%, respectively.
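Pixel-F1 here is the F1 score computed over per-pixel class predictions. A minimal NumPy sketch of the metric; the class labels and toy label maps are illustrative, not from the paper:

```python
import numpy as np

def pixel_f1(pred, target, cls):
    """F1 score for one class over per-pixel label maps."""
    p = (pred == cls)
    t = (target == cls)
    tp = np.logical_and(p, t).sum()
    fp = np.logical_and(p, ~t).sum()
    fn = np.logical_and(~p, t).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# toy 3x3 label maps: 0 = background, 1 = nucleus, 2 = cytoplasm
pred   = np.array([[1, 1, 0], [2, 2, 0], [0, 2, 0]])
target = np.array([[1, 1, 0], [2, 2, 2], [0, 2, 0]])
f1_nucleus = pixel_f1(pred, target, 1)    # perfect overlap -> 1.0
f1_cytoplasm = pixel_f1(pred, target, 2)  # one missed pixel lowers recall
```

Computing the score per class, as the paper reports it, exposes the asymmetry between the two targets: nuclei are compact and high-contrast, while cytoplasm boundaries are diffuse, which is consistent with the lower cytoplasm score.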

Conclusion

The model identifies nuclei and cytoplasm with varied morphologies. Its detection of cytoplasm with weak and subtle edge details indicates the effectiveness of the shifted-window self-attention mechanism of Swin-Unet in capturing global, long-distance pixel interactions in brightfield images. The adopted methodology can therefore be employed for precise segmentation of nuclei and cytoplasm when assessing the malignancy of lung cancer.
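The shifted-window mechanism alternates attention between a regular window grid and one displaced by half a window, so pixels near window borders interact across windows in the following layer. A minimal NumPy sketch of the two partitionings (the 4×4 map and window size 2 are illustrative, not the paper's settings):

```python
import numpy as np

def window_partition(x, w):
    """Split an (H, W) map into non-overlapping w x w windows."""
    H, W = x.shape
    return (x.reshape(H // w, w, W // w, w)
             .transpose(0, 2, 1, 3)
             .reshape(-1, w, w))

def shifted_partition(x, w):
    """Cyclically shift by w // 2 before partitioning, as in Swin."""
    shifted = np.roll(x, shift=(-(w // 2), -(w // 2)), axis=(0, 1))
    return window_partition(shifted, w)

x = np.arange(16).reshape(4, 4)   # toy 4x4 feature map, pixels numbered 0..15
regular = window_partition(x, 2)  # 4 windows of shape (2, 2)
shifted = shifted_partition(x, 2)
# Pixels 5 and 6 sit in different regular windows, so they never attend to
# each other there; after the half-window shift they share a window.
```

This alternation is what lets a purely local (windowed) attention scheme propagate information globally over successive layers, which the paper credits for recovering the weak cytoplasm edges.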


Source journal: Journal of Medical and Biological Engineering
CiteScore: 4.30
Self-citation rate: 5.00%
Articles per year: 81
Review time: 3 months
Journal description: The Journal of Medical and Biological Engineering (JMBE) is committed to advancing the standards of biomedical engineering. The journal publishes papers on clinical engineering, biomedical signals, medical imaging, bioinformatics, tissue engineering, and related areas. Contributions addressing emerging issues and technological developments that serve this purpose are also welcome.