Exploring the risks of automation bias in healthcare artificial intelligence applications: A Bowtie analysis

Impact Factor 3.7 · Q1 · Public, Environmental & Occupational Health
Moustafa Abdelwanis, Hamdan Khalaf Alarafati, Maram Muhanad Saleh Tammam, Mecit Can Emre Simsekler
DOI: 10.1016/j.jnlssr.2024.06.001
Journal: Journal of Safety Science and Resilience, Vol. 5, No. 4, pp. 460-469
Publication date: 2024-07-19
URL: https://www.sciencedirect.com/science/article/pii/S2666449624000410
Citations: 0

Abstract

This study conducts an in-depth review and Bowtie analysis of automation bias in AI-driven Clinical Decision Support Systems (CDSSs) within healthcare settings. Automation bias, the tendency of human operators to over-rely on automated systems, poses a critical challenge in implementing AI-driven technologies. To address this challenge, Bowtie analysis is employed to examine the causes and consequences of automation bias arising from over-reliance on AI-driven systems in healthcare. Furthermore, this study proposes preventive measures to address automation bias during the design phase of AI model development for CDSSs, along with effective mitigation strategies post-deployment. The findings highlight the essential role of a systems approach, integrating technological advancements, regulatory frameworks, and collaboration between AI developers and healthcare practitioners to diminish automation bias in AI-driven CDSSs. We further identify future research directions, proposing quantitative evaluations of the mitigation and preventive measures.
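The Bowtie method described in the abstract arranges threats (causes) on the left of a central "top event", consequences on the right, and barriers on each branch, preventive barriers at the design stage and mitigative barriers post-deployment. A minimal sketch of that structure is below; the specific threats, barriers, and class names are illustrative assumptions, not the paper's actual analysis.

```python
from dataclasses import dataclass, field

@dataclass
class Barrier:
    """A control on a branch of the bowtie."""
    name: str
    phase: str  # "design" (preventive) or "post-deployment" (mitigative)

@dataclass
class BowtieNode:
    """A threat (left side) or consequence (right side) of the top event."""
    description: str
    barriers: list = field(default_factory=list)

@dataclass
class Bowtie:
    top_event: str
    threats: list = field(default_factory=list)
    consequences: list = field(default_factory=list)

    def unmitigated(self):
        """List threats/consequences that have no barrier assigned yet."""
        return [n.description
                for n in self.threats + self.consequences
                if not n.barriers]

# Hypothetical example entries, not the paper's full bowtie.
bt = Bowtie(top_event="Clinician over-relies on AI-driven CDSS output")
bt.threats.append(BowtieNode(
    "Opaque model rationale",
    [Barrier("Explainable-AI output in the UI", "design")]))
bt.consequences.append(BowtieNode(
    "Missed or delayed diagnosis",
    [Barrier("Mandatory human second review", "post-deployment")]))
bt.threats.append(BowtieNode("High clinician workload"))

print(bt.unmitigated())  # → ['High clinician workload']
```

A structure like this makes barrier coverage auditable: branches returned by `unmitigated()` are exactly the gaps a systems-level review would flag.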
Source journal

Journal of Safety Science and Resilience
Subject areas: Management Science and Operations Research; Safety, Risk, Reliability and Quality; Safety Research
CiteScore: 8.70
Self-citation rate: 0.00%
Average review time: 72 days