Authors: Kaushik Raghavan, Sivaselvan Balasubramanian, Kamakoti Veezhinathan
Journal: Computational Intelligence, vol. 40, no. 3 (Journal Article, published 2024-06-24)
DOI: 10.1111/coin.12660 (https://onlinelibrary.wiley.com/doi/10.1111/coin.12660)
Explainable artificial intelligence for medical imaging: Review and experiments with infrared breast images
There is a growing trend of using artificial intelligence, particularly deep learning algorithms, in medical diagnostics, revolutionizing healthcare by improving efficiency, accuracy, and patient outcomes. However, the use of artificial intelligence in medical diagnostics comes with a critical need to explain the reasoning behind AI-based predictions and to ensure transparency in decision-making. Explainable artificial intelligence has emerged as a crucial research area addressing this need for transparency and interpretability. Explainable artificial intelligence techniques aim to provide insight into the decision-making process of AI systems, enabling clinicians to understand the factors an algorithm considers in reaching its predictions. This paper presents a detailed review of saliency-based (visual) methods, such as class activation methods, which have gained popularity in medical imaging because they provide visual explanations by highlighting the regions of an image that most influence the model's decision. We also survey the literature on non-visual methods, although the focus remains on visual methods. Drawing on the existing literature, we experiment with infrared breast images for detecting breast cancer. Towards the end of the paper, we propose an "attention-guided Grad-CAM" that enhances the visualizations produced for explainable artificial intelligence. The existing literature shows that explainable artificial intelligence techniques have not been explored in the context of infrared medical images, which opens up a wide range of opportunities for further research to turn clinical thermography into an assistive technology for the medical community.
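To make the class-activation idea concrete, the following is a minimal NumPy sketch of vanilla Grad-CAM, the baseline the abstract builds on (not the paper's attention-guided variant): channel weights are the global-average-pooled gradients of the target class score with respect to the last convolutional layer's activations, and the heatmap is the ReLU of the weighted sum of those activation maps. The array shapes and function name here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (K, H, W) activations of the last conv layer
    gradients:    (K, H, W) gradients of the target class score
                  with respect to those activations
    Returns a (H, W) heatmap normalized to [0, 1].
    """
    # alpha_k: global-average-pool the gradients over each channel
    weights = gradients.mean(axis=(1, 2))              # shape (K,)
    # Weighted sum of feature maps over the channel axis
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    # ReLU: keep only features with a positive influence on the class
    cam = np.maximum(cam, 0.0)
    # Normalize to [0, 1] so the map can be overlaid on the input image
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In practice the two inputs would come from a forward and backward pass through a trained CNN (e.g., via framework hooks on the last convolutional layer); the heatmap is then upsampled to the input resolution and overlaid on the infrared image.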
About the journal:
This leading international journal promotes and stimulates research in the field of artificial intelligence (AI). Covering a wide range of issues - from the tools and languages of AI to its philosophical implications - Computational Intelligence provides a vigorous forum for the publication of both experimental and theoretical research, as well as surveys and impact studies. The journal is designed to meet the needs of a wide range of AI workers in academic and industrial research.