Deep learning based computer aided diagnosis (CAD) tool supported by explainable artificial intelligence for breast cancer exploration
Marwa Naas, Hiba Mzoughi, Ines Njeh, Mohamed Ben Slima
Applied Intelligence, vol. 55, no. 7, published 2025-04-21. DOI: 10.1007/s10489-025-06561-8
https://link.springer.com/article/10.1007/s10489-025-06561-8
Citations: 0
Abstract
Breast cancer (BC) is a leading cause of death among women, with breast ultrasound (BUS) commonly used for early detection. However, BUS images are often affected by speckle noise, low tissue contrast, and artifacts, which can compromise image analysis tasks such as segmentation and classification. Deep Learning (DL)-based Computer-Aided Diagnosis (CAD) systems can significantly enhance clinical diagnosis by leveraging self-learning capabilities to extract a sophisticated hierarchy of features from images. However, DL models often lack transparency in their internal decision-making processes, which is critical for sensitive applications like breast imaging. To address this, Explainable Artificial Intelligence (XAI) has emerged as a key approach to making DL models more transparent and interpretable for clinicians. This paper presents an efficient and fully automated DL-based CAD tool enhanced by XAI techniques for the precise exploration and diagnosis of BC using ultrasound images. The proposed CAD tool involves four key steps: preprocessing, segmentation, XAI-based explainability, and feature extraction. In the preprocessing phase, an Autoencoder-based architecture is explored to effectively reduce speckle noise. For segmentation, our approach introduces an optimized architecture inspired by the DeepLabV3+ model. To ensure transparency in the model's predictions, Gradient-weighted Class Activation Mapping (Grad-CAM) is employed to provide interpretable insights into the decisions made by the deep neural network. Lastly, relevant features are extracted using the Gray-Level Co-occurrence Matrix (GLCM) technique. The proposed approach was rigorously evaluated on two publicly available benchmark datasets. For the first dataset (A), the evaluation metrics achieved were as follows: Dice coefficient (0.979), accuracy (0.935), intersection over union (0.955), precision (0.984), F1 score (0.981), and recall (0.980). Similarly, for the second dataset (B), the model showed notable improvements, achieving a Dice coefficient (0.981), accuracy (0.974), intersection over union (0.963), precision (0.986), F1 score (0.985), and recall (0.983). These results highlight the exceptional performance of the optimized DeepLabV3+ model in segmentation tasks, outperforming both U-Net and Residual U-Net architectures.
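The abstract reports Dice, intersection over union (IoU), accuracy, precision, recall, and F1 for the two datasets. As a minimal sketch of how such pixel-wise overlap metrics are commonly computed from binary segmentation masks (this is not the authors' evaluation code; the mask shapes, the epsilon smoothing term, and the function name segmentation_metrics are illustrative assumptions):

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> dict:
    """Compute overlap metrics for binary segmentation masks.

    Both inputs are boolean/0-1 arrays of the same shape, where 1 marks
    pixels predicted (or annotated) as lesion.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)

    tp = np.logical_and(pred, target).sum()    # true positives
    fp = np.logical_and(pred, ~target).sum()   # false positives
    fn = np.logical_and(~pred, target).sum()   # false negatives
    tn = np.logical_and(~pred, ~target).sum()  # true negatives

    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return {
        "dice": 2 * tp / (2 * tp + fp + fn + eps),          # Dice coefficient
        "iou": tp / (tp + fp + fn + eps),                    # intersection over union
        "accuracy": (tp + tn) / (tp + tn + fp + fn + eps),   # pixel accuracy
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall + eps),
    }

# Usage example with a synthetic ground-truth mask and a perturbed prediction.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.random((256, 256)) > 0.7   # stand-in ground-truth mask
    pr = gt.copy()
    pr[:10] = ~pr[:10]                  # inject some disagreement
    print(segmentation_metrics(pr, gt))
```

Note that on pixel-level binary masks the Dice coefficient and the F1 score coincide by definition; papers that report both (as here) typically compute F1 from lesion-level or image-level counts instead, a detail the abstract does not specify.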
Journal description:
With a focus on research in artificial intelligence and neural networks, this journal addresses solutions to real-life manufacturing, defense, management, government, and industrial problems that are too complex to be solved through conventional approaches and that require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments addressing real, complex problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.