CNMI-YOLO: Domain Adaptive and Robust Mitosis Identification in Digital Pathology

IF 5.1 | JCR Region 2 (Medicine) | Q1 MEDICINE, RESEARCH & EXPERIMENTAL
DOI: 10.1016/j.labinv.2024.102130
Journal: Laboratory Investigation
Published: 2024-09-02 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0023683724018087
Citations: 0

Abstract

ConvNext Mitosis Identification—You Only Look Once (CNMI-YOLO): Domain Adaptive and Robust Mitosis Identification in Digital Pathology

In digital pathology, accurate mitosis detection in histopathological images is critical for cancer diagnosis and prognosis. However, this remains challenging due to the inherent variability in cell morphology and the domain shift problem. This study introduces ConvNext Mitosis Identification-You Only Look Once (CNMI-YOLO), a new 2-stage deep learning method that uses the YOLOv7 architecture for cell detection and the ConvNeXt architecture for cell classification. The goal is to improve the identification of mitosis in different types of cancers. We utilized the Mitosis Domain Generalization Challenge 2022 data set in the experiments to ensure the model’s robustness and success across various scanners, species, and cancer types. The CNMI-YOLO model demonstrates superior performance in accurately detecting mitotic cells, significantly outperforming existing models in terms of precision, recall, and F1 score. The CNMI-YOLO model achieved an F1 score of 0.795 on the Mitosis Domain Generalization Challenge 2022 and demonstrated robust generalization with F1 scores of 0.783 and 0.759 on the external melanoma and sarcoma test sets, respectively. Additionally, the study included ablation studies to evaluate various object detection and classification models, such as Faster-RCNN and Swin Transformer. Furthermore, we assessed the model’s robustness performance on unseen data, confirming its ability to generalize and its potential for real-world use in digital pathology, using soft tissue sarcoma and melanoma samples not included in the training data set.
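The abstract describes a two-stage detect-then-classify pipeline evaluated with point-level precision, recall, and F1. As a rough illustration (not the authors' code), the sketch below shows how predicted mitosis locations are typically matched to ground-truth annotations within a pixel radius to compute these metrics; the 30 px radius, the greedy matching strategy, and all function names are assumptions for illustration, not the challenge's exact protocol.

```python
import math

def match_detections(preds, truths, radius=30.0):
    """Greedily match predicted points to ground-truth points.

    A prediction counts as a true positive if it lies within `radius`
    pixels of a not-yet-matched ground-truth mitosis (the 30 px radius
    is an illustrative assumption). Returns (tp, fp, fn).
    """
    unmatched = list(truths)
    tp = 0
    for px, py in preds:
        best, best_d = None, radius
        for gt in unmatched:
            d = math.hypot(px - gt[0], py - gt[1])
            if d <= best_d:
                best, best_d = gt, d
        if best is not None:
            unmatched.remove(best)  # each ground truth matches at most once
            tp += 1
    fp = len(preds) - tp      # predictions with no nearby ground truth
    fn = len(unmatched)       # ground-truth mitoses that were missed
    return tp, fp, fn

def f1_score(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: three predictions, two ground-truth mitoses.
preds = [(10, 10), (100, 100), (300, 300)]
truths = [(12, 11), (105, 98)]
tp, fp, fn = match_detections(preds, truths)
print(tp, fp, fn, round(f1_score(tp, fp, fn), 3))  # → 2 1 0 0.8
```

In a two-stage pipeline like the one described, the detector's candidate boxes would first be filtered by the classifier before this matching step, which is how the classification stage can raise precision without changing the detector's recall ceiling.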

Source Journal
Laboratory Investigation (Medicine – Pathology)
CiteScore: 8.30
Self-citation rate: 0.00%
Annual articles: 125
Review time: 2 months
About the journal: Laboratory Investigation is an international journal owned by the United States and Canadian Academy of Pathology. Laboratory Investigation offers prompt publication of high-quality original research in all biomedical disciplines relating to the understanding of human disease and the application of new methods to the diagnosis of disease. Both human and experimental studies are welcome.