Olya Rezaeian, Onur Asan, Alparslan Emrah Bayrak
Title: The impact of AI explanations on clinicians’ trust and diagnostic accuracy in breast cancer
DOI: 10.1016/j.apergo.2025.104577
Journal: Applied Ergonomics, Volume 129, Article 104577 (Q2, Engineering, Industrial; IF 3.1)
Publication date: 2025-06-26
URL: https://www.sciencedirect.com/science/article/pii/S0003687025001139
Citations: 0
Abstract
Advances in machine learning have created new opportunities to develop artificial intelligence (AI)-based clinical decision support systems from past clinical data and to improve diagnosis decisions in life-threatening illnesses such as breast cancer. Providing explanations for AI recommendations is one possible way to address trust and usability issues in black-box AI systems. This paper presents the results of an experiment assessing the impact of varying levels of AI explanation on clinicians’ trust and diagnostic accuracy in a breast cancer application, and the impact of demographics on the findings. The study includes 28 clinicians in varying medical roles related to breast cancer diagnosis. The results show that increasing levels of explanation do not always improve trust or diagnostic performance. They also show that while some self-reported measures, such as AI familiarity, depend on gender, age, and experience, the behavioral assessments of trust and performance are independent of those variables.
About the journal:
Applied Ergonomics is aimed at ergonomists and all those interested in applying ergonomics/human factors to the design, planning, and management of technical and social systems at work or leisure. Its readership is international, with subscribers in over 50 countries. Professionals for whom Applied Ergonomics is of interest include ergonomists, designers, industrial engineers, health and safety specialists, systems engineers, design engineers, organizational psychologists, occupational health specialists, and human-computer interaction specialists.