M. Champendal, R.T. Ribeiro, H. Müller, J.O. Prior, C. Sá dos Reis
User-centric eXplainable AI criteria for implementing AI-based denoising in PET/CT
DOI: 10.1016/j.radi.2025.103194
Journal: Radiography, Volume 31, Issue 6, Article 103194 (published 2025-10-01; JCR Q2, Radiology, Nuclear Medicine & Medical Imaging)
URL: https://www.sciencedirect.com/science/article/pii/S1078817425003384
Citations: 0
Abstract
Introduction
The clinical adoption of AI-based denoising in PET/CT relies on the development of transparent and trustworthy tools that align with radiographers' needs and support integration into routine practice. This study aims to determine the key characteristics of an eXplainable Artificial Intelligence (XAI) tool aligned with radiographers' needs, in order to facilitate the clinical adoption of AI-based denoising algorithms in PET/CT.
Methods
Two focus groups were organised with ten voluntary participants, a convenience sample of radiographers recruited from nuclear medicine departments in Western Switzerland. Two scenarios, one matching and one mismatching the ground truth, were used to identify their needs and the questions they would want to ask to understand the AI-based denoising algorithm. The characteristics an XAI tool should possess to best meet those needs were also investigated. Content analysis was performed following the three steps outlined by Wanlin. The study received ethics approval.
Results
Ten radiographers (aged 31–60 years) identified two levels of explanation: (1) simple, global explanations with numerical confidence levels for rapid understanding in routine settings; (2) detailed, case-specific explanations using mixed formats where necessary, depending on the clinical situation and the user, to build confidence and support decision-making. Key questions concerned the functions of the algorithm (‘what’), the clinical context (‘when’) and the dependencies of the results (‘how’). An effective XAI tool should be simple, adaptable, user-friendly and not disruptive to workflows.
Conclusion
Radiographers need two levels of explanation from XAI tools: global summaries that preserve workflow efficiency and detailed, case-specific insights when needed. Meeting these needs is key to fostering trust, understanding, and integration of AI-based denoising in PET/CT.
Implications for practice
Implementing adaptive XAI tools tailored to radiographers’ needs can support clinical workflows and accelerate the adoption of AI in PET/CT imaging.
Radiography (Radiology, Nuclear Medicine & Medical Imaging)
CiteScore: 4.70
Self-citation rate: 34.60%
Articles published per year: 169
Review time: 63 days
About the journal:
Radiography is an international, English-language, peer-reviewed journal of diagnostic imaging and radiation therapy. It is the official professional journal of the College of Radiographers and is published quarterly. Radiography aims to publish the highest-quality material, both clinical and scientific, on all aspects of diagnostic imaging, radiation therapy and oncology.