Preparing healthcare organisations for using artificial intelligence effectively.

Ian A Scott, Anton van der Vegt, Stephen Canaris, Paul Nolan, Keren Pointon

Australian Health Review: a publication of the Australian Hospital Association (impact factor 1.4). Published 1 July 2025. DOI: https://doi.org/10.1071/AH25102

Abstract

Healthcare organisations (HCOs) must prepare for large-scale implementation of artificial intelligence (AI)-enabled tools that can demonstrably achieve one or more aims of better care, improved efficiency, enhanced professional and patient experience, and greater equity. Failure to do so may disadvantage patients, staff, and the organisation itself. We outline key strategies Australian HCOs should enact to maximise successful AI implementation: (1) establish transparent and accountable governance structures tasked with ensuring responsible use of AI, including shifting organisational culture towards AI; (2) invest in delivering the human talent, technical infrastructure, and organisational change management that underpin a sustainable AI ecosystem; (3) gain staff and patient trust in using AI tools through their demonstrated value to real-world care and minimal threats to patient safety and privacy, the existence of reliable governance, provision of appropriate training and opportunities for user co-design, transparency in AI tool use and consent, and retention of user agency in responding to AI-generated advice; (4) establish risk assessment and mitigation processes that delineate unacceptable-, high-, medium-, and low-risk AI tools, based on task criticality and the rigour of performance evaluations, and monitor and respond to any adverse impacts on patient outcomes; and (5) determine when and how liability for patient harm associated with a specific AI tool rests with, or is shared between, staff, developers, and the deploying HCO itself. In realising the benefits of AI, HCOs must build the necessary AI infrastructure, literacy, and cultural adaptation through foresighted planning and procurement of resources.
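
The abstract does not specify how the risk tiering in strategy (4) would be operationalised. As a purely illustrative sketch, the Python fragment below maps an AI tool's task criticality and evaluation rigour to the tier names used in the abstract; all class names, level labels, and decision rules are hypothetical assumptions introduced here, not taken from the paper.

```python
# Illustrative sketch only: classify an AI tool into the risk tiers named in the
# abstract (unacceptable / high / medium / low) from two assumed inputs, task
# criticality and evaluation rigour. Levels and rules are hypothetical.

from dataclasses import dataclass

CRITICALITY_LEVELS = ("low", "medium", "high")                 # impact if the tool errs
EVALUATION_LEVELS = ("none", "retrospective", "prospective")   # rigour of the evidence base


@dataclass
class AITool:
    name: str
    task_criticality: str   # one of CRITICALITY_LEVELS
    evaluation_rigour: str  # one of EVALUATION_LEVELS


def risk_tier(tool: AITool) -> str:
    """Map a tool to an illustrative risk tier."""
    if tool.task_criticality not in CRITICALITY_LEVELS:
        raise ValueError(f"unknown criticality: {tool.task_criticality}")
    if tool.evaluation_rigour not in EVALUATION_LEVELS:
        raise ValueError(f"unknown evaluation rigour: {tool.evaluation_rigour}")

    # A high-stakes task with no credible evaluation is not deployed at all.
    if tool.task_criticality == "high" and tool.evaluation_rigour == "none":
        return "unacceptable"
    # High-stakes tasks remain high risk even with evidence; they need ongoing monitoring.
    if tool.task_criticality == "high":
        return "high"
    # Medium-stakes tasks drop a tier when prospective evidence exists.
    if tool.task_criticality == "medium":
        return "low" if tool.evaluation_rigour == "prospective" else "medium"
    return "low"


if __name__ == "__main__":
    tool = AITool("sepsis-alert", task_criticality="high", evaluation_rigour="retrospective")
    print(tool.name, "->", risk_tier(tool))  # sepsis-alert -> high
```

In practice an HCO would replace these two inputs and rules with its own governance criteria; the sketch only shows the shape of a tiering function that a risk register or monitoring pipeline could call.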
