{"title":"Enhancing explainability in medical image classification and analyzing osteonecrosis X-ray images using shadow learner system","authors":"Yaoyang Wu, Simon Fong, Liansheng Liu","doi":"10.1007/s10489-024-05916-x","DOIUrl":null,"url":null,"abstract":"<p>Numerous applications have explored medical image classification using deep learning models. With the emergence of Explainable AI (XAI), researchers have begun to recognize its potential in validating the authenticity and correctness of results produced by black-box deep learning models. On the other hand, current diagnostic approaches for osteonecrosis face significant challenges, including difficulty in early detection, subjectivity in image interpretation, and reliance on surgical interventions without a comprehensive diagnostic foundation. This paper presents a novel Medical Computer-Aid-Diagnosis System—the Shadow Learning System framework—which integrates a convolutional neural network (CNN) with an Explainable AI method. This system not only performs conventional computer-aiding-diagnosis functions but also uniquely exploits misclassified data samples to provide additional medically relevant information from the machine learning model’s perspective, assisting doctors in their diagnostic process. The implementation of XAI techniques in our proposed system goes beyond merely validating CNN model results; it also enables the extraction of valuable information from medical images through an unconventional machine learning perspective. Our paper aims to enhance and extend the general structure and detailed design of the Shadow Learner System, making it more advantageous not only for human users but also for the deep learning model itself. A case study on femoral head osteonecrosis was conducted using our proposed system, which demonstrated improved accuracy and reliability in its prediction results. 
Experimental results interpreted using XAI methods are visualized to prove the confidence of our proposed model that generates reasonable results, confirming the effectiveness of the proposed model.</p>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 2","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2024-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-024-05916-x","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Numerous applications have explored medical image classification using deep learning models. With the emergence of Explainable AI (XAI), researchers have begun to recognize its potential for validating the authenticity and correctness of results produced by black-box deep learning models. Meanwhile, current diagnostic approaches for osteonecrosis face significant challenges, including difficulty in early detection, subjectivity in image interpretation, and reliance on surgical interventions without a comprehensive diagnostic foundation. This paper presents a novel medical computer-aided diagnosis system, the Shadow Learner System framework, which integrates a convolutional neural network (CNN) with an Explainable AI method. The system not only performs conventional computer-aided diagnosis but also uniquely exploits misclassified data samples to provide additional medically relevant information from the machine learning model's perspective, assisting doctors in the diagnostic process. The XAI techniques in the proposed system go beyond merely validating CNN model results; they also enable the extraction of valuable information from medical images through an unconventional machine learning perspective. The paper enhances and extends the general structure and detailed design of the Shadow Learner System, making it more advantageous not only for human users but also for the deep learning model itself. A case study on femoral head osteonecrosis conducted with the proposed system demonstrated improved accuracy and reliability in its predictions. Experimental results interpreted with XAI methods are visualized to show that the proposed model generates reasonable results, confirming its effectiveness.
About the journal:
Focusing on research in artificial intelligence and neural networks, this journal addresses real-life problems in manufacturing, defense, management, government, and industry that are too complex to be solved through conventional approaches and instead require the simulation of intelligent thought processes, heuristics, applications of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.