Explainable artificial intelligence in deep learning-based detection of aortic elongation on chest X-ray images
Estela Ribeiro, Diego A C Cardenas, Felipe M Dias, Jose E Krieger, Marco A Gutierrez
European Heart Journal - Digital Health, published 2024-06-25. DOI: 10.1093/ehjdh/ztae045
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11417491/pdf/
Abstract
Aims: Aortic elongation can result from age-related changes, congenital factors, aneurysms, or conditions affecting blood vessel elasticity. It is associated with cardiovascular diseases and severe complications such as aortic aneurysm and dissection. We qualitatively and quantitatively assess explainability methods to understand the decisions of a deep learning model for detecting aortic elongation using chest X-ray (CXR) images.
Methods and results: In this study, we evaluated the performance of deep learning models (DenseNet and EfficientNet) for detecting aortic elongation using transfer learning and fine-tuning techniques with CXR images as input. EfficientNet achieved higher accuracy (86.7% ± 2.1), precision (82.7% ± 2.7), specificity (89.4% ± 1.7), F1 score (82.5% ± 2.9), and area under the receiver operating characteristic curve (92.7% ± 0.6), but lower sensitivity (82.3% ± 3.2), compared with DenseNet. To gain insight into the decision-making process of these models, we employed two explainability methods, gradient-weighted class activation mapping (Grad-CAM) and local interpretable model-agnostic explanations (LIME), which enabled us to identify the expected location of aortic elongation in CXR images. Additionally, we used the pixel-flipping method to quantitatively assess the model interpretations, providing valuable insights into model behaviour.
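The Grad-CAM method mentioned above weights a convolutional layer's activation maps by the spatially averaged gradients of the target class score, sums them, and keeps only the positive evidence. A minimal, framework-independent NumPy sketch of that computation (the activations and gradients here are synthetic stand-ins for what a hook on a DenseNet/EfficientNet layer would capture; this is not the authors' implementation):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM heatmap from a conv layer's activations (C, H, W) and the
    gradients of the target class score w.r.t. those activations (C, H, W)."""
    # Channel weights: global-average-pool the gradients over spatial dims.
    weights = gradients.mean(axis=(1, 2))
    # Weighted sum of the activation maps -> (H, W) localization map.
    cam = np.tensordot(weights, activations, axes=1)
    # ReLU: keep only features with a positive influence on the class score.
    cam = np.maximum(cam, 0)
    if cam.max() > 0:
        cam = cam / cam.max()  # normalize to [0, 1] for overlay on the CXR
    return cam

# Toy demonstration with synthetic activations/gradients.
rng = np.random.default_rng(1)
acts = rng.random((4, 8, 8))            # 4 channels, 8x8 feature maps
grads = rng.standard_normal((4, 8, 8))  # gradients of the class score
cam = grad_cam(acts, grads)
```

In practice the heatmap is upsampled to the input resolution and overlaid on the CXR to show where the model "looks" when predicting aortic elongation.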
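The pixel-flipping evaluation can likewise be sketched independently of any framework: pixels are removed in decreasing order of attributed relevance, and a faithful explanation should make the class score collapse quickly. The sketch below uses a toy "model" (mean intensity) and assumes relevance equals pixel value for that model; names and step sizes are illustrative, not from the paper:

```python
import numpy as np

def pixel_flipping_curve(image, relevance, predict, step=64, baseline=0.0):
    """Flip pixels in decreasing order of relevance and record how the
    model's score for the target class degrades.

    image     : 2-D array (H, W)
    relevance : 2-D per-pixel relevance map (e.g. from Grad-CAM or LIME)
    predict   : callable mapping an image to a scalar class score
    step      : number of pixels flipped per iteration
    baseline  : value the flipped pixels are replaced with
    """
    order = np.argsort(relevance, axis=None)[::-1]  # most relevant first
    perturbed = image.copy()
    scores = [predict(perturbed)]
    for start in range(0, order.size, step):
        idx = order[start:start + step]
        perturbed.flat[idx] = baseline              # "flip" this batch
        scores.append(predict(perturbed))
    return np.asarray(scores)

# Toy demonstration: the "model" scores an image by its mean intensity,
# so each pixel's relevance is simply its value.
rng = np.random.default_rng(0)
img = rng.random((16, 16))
curve = pixel_flipping_curve(img, relevance=img, predict=lambda x: x.mean(),
                             step=32)
```

A steeply decreasing curve (small area under it) indicates the explanation ranked the truly influential pixels first, which is how the paper's quantitative comparison of Grad-CAM and LIME can be read.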
Conclusion: Our study presents a comprehensive strategy for analysing CXR images by integrating aortic elongation detection models with explainable artificial intelligence techniques. By enhancing the interpretability and understanding of the models' decisions, this approach holds promise for aiding clinicians in timely and accurate diagnosis, potentially improving patient outcomes in clinical practice.