BCECNN: an explainable deep ensemble architecture for accurate diagnosis of breast cancer.

Impact Factor: 3.8 · CAS Region 3 (Medicine) · JCR Q2 (Medical Informatics)
Uçman Ergün, Tuğçe Çoban, İsmail Kayadibi
DOI: 10.1186/s12911-025-03186-2
Journal: BMC Medical Informatics and Decision Making, vol. 25, no. 1, p. 374
Published: 2025-10-13 (Journal Article)
Citations: 0

Abstract


Background: Breast cancer remains one of the leading causes of cancer-related deaths globally, affecting both women and men. This study aims to develop a novel deep learning (DL)-based architecture, the Breast Cancer Ensemble Convolutional Neural Network (BCECNN), to enhance the diagnostic accuracy and interpretability of breast cancer detection systems.

Methods: The BCECNN architecture incorporates two ensemble learning (EL) structures: Triple Ensemble CNN (TECNN) and Quintuple Ensemble CNN (QECNN). These ensemble models integrate the predictions of multiple CNN architectures (AlexNet, VGG16, ResNet-18, EfficientNetB0, and XceptionNet) using a majority voting mechanism. The models were trained using transfer learning (TL) and evaluated on five distinct sub-datasets generated from the Artificial Intelligence Smart Solution Laboratory (AISSLab) dataset, which consists of 266 mammography images labeled and validated by radiologists. To improve transparency and interpretability, Explainable Artificial Intelligence (XAI) techniques, including Gradient-weighted Class Activation Mapping (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME), were applied. Additionally, explainability was assessed through clinical evaluation by an experienced radiologist.
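The paper does not provide code; a minimal sketch of the hard majority-voting step it describes, under the assumption that each member CNN emits a single class label per image (function and label names below are illustrative, not from the paper), could look like:

```python
from collections import Counter

def majority_vote(predictions):
    """Return the class label predicted by the most models.

    Ties are broken in favor of the label encountered first,
    which Counter.most_common guarantees on Python 3.7+.
    """
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical per-model labels for one mammogram, in the order
# AlexNet, VGG16, EfficientNetB0 (the TECNN members named in the paper).
tecnn_preds = ["malignant", "benign", "malignant"]
print(majority_vote(tecnn_preds))  # malignant
```

With an odd number of voters, as in both TECNN (3) and QECNN (5), a two-class vote can never tie, which is one practical reason hard voting pairs well with ensembles of three or five members.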

Results: Experimental results demonstrated that the TECNN model (comprising AlexNet, VGG16, and EfficientNetB0) achieved the highest accuracy, 98.75%, on the AISSLab-v2 dataset. The integration of XAI methods substantially enhanced the interpretability of the model, enabling clinicians to better understand and validate the model's decision-making process. Clinical evaluation confirmed that the XAI outputs aligned well with expert assessments, underscoring the practical utility of the model in a diagnostic setting.
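The Grad-CAM heatmaps discussed above follow a standard recipe that the paper applies rather than defines: channel weights are the global-average-pooled gradients of the target class score with respect to the last convolutional feature maps, and the heatmap is the ReLU of the weighted sum of those maps. A generic NumPy sketch of that core computation (assuming activations and gradients have already been extracted from a network, which is framework-specific) is:

```python
import numpy as np

def grad_cam(activations, gradients):
    """Compute a Grad-CAM heatmap from one layer's outputs.

    activations: feature maps, shape (C, H, W)
    gradients:   d(class score)/d(activations), same shape
    Returns an (H, W) heatmap scaled to [0, 1].
    """
    # Channel importance: global average pooling of the gradients.
    weights = gradients.mean(axis=(1, 2))               # shape (C,)
    # Weighted sum of the activation maps over channels.
    cam = np.einsum("c,chw->hw", weights, activations)  # shape (H, W)
    # ReLU keeps only features with a positive influence on the class.
    cam = np.maximum(cam, 0)
    if cam.max() > 0:
        cam /= cam.max()
    return cam
```

In practice the resulting low-resolution map is upsampled to the input image size and overlaid on the mammogram, which is what lets a radiologist check whether the model attended to the suspicious region.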

Conclusion: The BCECNN model presents a promising solution for improving both the accuracy and interpretability of breast cancer diagnostic systems. Unlike many previous studies that rely on single architectures or large datasets, BCECNN leverages the strengths of an ensemble of CNN models and performs robustly even with limited data. It integrates advanced XAI techniques, such as Grad-CAM and LIME, to provide visual justifications for model decisions, enhancing clinical interpretability. Moreover, the model was validated using the AISSLab dataset, which was designed to reflect real-world diagnostic challenges. This combination of EL, interpretability, and robust performance on small yet clinically relevant data positions BCECNN as a novel and reliable decision support tool for AI-assisted breast cancer diagnostics.

Source journal
CiteScore: 7.20
Self-citation rate: 5.70%
Articles per year: 297
Review time: 1 month
Journal description: BMC Medical Informatics and Decision Making is an open access journal publishing original peer-reviewed research articles in relation to the design, development, implementation, use, and evaluation of health information technologies and decision-making for human health.