ECMTrans-net: Multi-class Segmentation Network Based on Tumor Tissue in Endometrial Cancer Pathology Images.

IF 4.7 · JCR Q1 (Pathology) · CAS Region 2 (Medicine)
Tong Yang, Ping Li, Bo Liu, Yuchun Lv, Dage Fan, Yuling Fan, Peizhong Liu, Yaping Ni
Journal: American Journal of Pathology
DOI: 10.1016/j.ajpath.2024.10.008
Published online: October 28, 2024
Citations: 0

Abstract

Endometrial cancer has the second-highest incidence among malignant tumors of the female reproductive system, and accurate, efficient analysis of endometrial cancer pathology images is an important component of computer-aided diagnosis. However, these images pose several challenges: solid tumors are small, lesion areas vary in morphology, and solid tumors are difficult to distinguish from non-solid tumors, all of which reduce the accuracy of subsequent pathological analysis. Therefore, an Endometrial Cancer Multi-class Transformer Network (ECMTrans-net) is proposed to improve the segmentation accuracy of endometrial cancer pathology images. First, an ECM-Attention module is proposed that sequentially infers attention maps along three separate dimensions (channel, local spatial, and global spatial) and multiplies the attention maps with the input feature map for adaptive feature refinement. This addresses the small size of solid tumors and their similarity to non-solid tumors, further improving the accuracy of solid-tumor segmentation. Second, an ECM-Transformer module is proposed that fuses multi-class feature information and dynamically adjusts the receptive field, addressing the complexity of tumor features. Experiments on the solid tumor endometrial cancer pathology (ST-ECP) dataset show that ECMTrans-net outperforms state-of-the-art image segmentation methods, with average Accuracy, MIoU, Precision, and Dice of 0.952, 0.927, 0.931, and 0.901, respectively.
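The abstract does not give implementation details for ECM-Attention; as a rough illustration of the scheme it describes (attention maps inferred sequentially along channel, local spatial, and global spatial dimensions, each multiplied into the feature map for adaptive refinement), a minimal NumPy sketch might look like the following. All function names and the choice of pooling and gating operations here are assumptions, loosely modeled on CBAM-style attention, not the authors' actual design.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    """Gate each channel by a sigmoid of its global average (assumed design)."""
    gate = _sigmoid(x.mean(axis=(1, 2)))           # shape (C,)
    return x * gate[:, None, None]

def local_spatial_attention(x, k=3):
    """Gate each location by a sigmoid of a local k-by-k mean (assumed design)."""
    m = x.mean(axis=0)                             # channel-pooled map, (H, W)
    pad = k // 2
    mp = np.pad(m, pad, mode="edge")
    local = np.zeros_like(m)
    for i in range(m.shape[0]):
        for j in range(m.shape[1]):
            local[i, j] = mp[i:i + k, j:j + k].mean()
    return x * _sigmoid(local)[None, :, :]

def global_spatial_attention(x):
    """Gate each location by a sigmoid of the channel-pooled map (assumed design)."""
    m = x.mean(axis=0)
    return x * _sigmoid(m)[None, :, :]

def ecm_attention_sketch(x):
    """Sequential refinement: channel -> local spatial -> global spatial,
    each stage multiplying its attention map into the feature map."""
    x = channel_attention(x)
    x = local_spatial_attention(x)
    x = global_spatial_attention(x)
    return x
```

Each stage produces a gate in (0, 1) that rescales the features, so the refined tensor keeps the input shape; in the paper this presumably operates on convolutional feature maps inside the segmentation network rather than on raw images.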

Source journal metrics:
CiteScore: 11.40
Self-citation rate: 0.00%
Annual publications: 178
Review time: 30 days
Journal description: The American Journal of Pathology, official journal of the American Society for Investigative Pathology, published by Elsevier, Inc., seeks high-quality original research reports, reviews, and commentaries related to the molecular and cellular basis of disease. The editors will consider basic, translational, and clinical investigations that directly address mechanisms of pathogenesis or provide a foundation for future mechanistic inquiries. Examples of such foundational investigations include data mining, identification of biomarkers, molecular pathology, and discovery research. Foundational studies that incorporate deep learning and artificial intelligence are also welcome. High priority is given to studies of human disease and relevant experimental models using molecular, cellular, and organismal approaches.