Two paths for health AI governance: paternalism or democracy.

Future Healthcare Journal. Pub Date: 2024-09-19. eCollection Date: 2024-09-01. DOI: 10.1016/j.fhj.2024.100180
Cori Crider
{"title":"Two paths for health AI governance: paternalism or democracy.","authors":"Cori Crider","doi":"10.1016/j.fhj.2024.100180","DOIUrl":null,"url":null,"abstract":"<p><p>This article assesses the cyclical failures of NHS data modernisation programmes, and considers that they fail because they proceed from a faulty - excessively paternalistic - governance model. Bias in algorithmic delivery of healthcare, a demonstrated problem with many existing health applications, is another serious risk. To regain trust and move towards better use of data in the NHS, we should democratise the development of these systems, and de-risk operational systems from issues such as automation bias. As a comparison, the essay explores two approaches to trust and bias problems in other contexts: Taiwan's digital democracy, and American Airlines' struggles to overcome automation bias in their pilots.</p>","PeriodicalId":73125,"journal":{"name":"Future healthcare journal","volume":"11 3","pages":"100180"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11452825/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Future healthcare journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1016/j.fhj.2024.100180","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/9/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This article assesses the cyclical failures of NHS data modernisation programmes and argues that they fail because they proceed from a faulty, excessively paternalistic governance model. Bias in algorithmic delivery of healthcare, a demonstrated problem with many existing health applications, is another serious risk. To regain trust and move towards better use of data in the NHS, we should democratise the development of these systems and de-risk operational systems against issues such as automation bias. As a comparison, the essay explores two approaches to trust and bias problems in other contexts: Taiwan's digital democracy, and American Airlines' struggles to overcome automation bias in its pilots.
