“Hey SyRI, tell me about algorithmic accountability”: Lessons from a landmark case

Data & Policy · Pub Date: 2023-01-10 · DOI: 10.1017/dap.2022.39
M. Wieringa
Impact Factor: 1.8 · JCR Q3 (Public Administration)
Citations: 1

Abstract

The promised merits of data-driven innovation in general, and of algorithmic systems in particular, hardly need enumeration. However, as decision-making tasks are increasingly delegated to algorithmic systems, questions of accountability arise. These pressing questions of algorithmic accountability, particularly with regard to data-driven innovation in the public sector, deserve ample scholarly attention. This paper therefore brings together perspectives from governance studies and critical algorithm studies to assess how algorithmic accountability succeeds or falls short in practice, and analyses the Dutch System Risk Indication (SyRI) as an empirical case. Dissecting a concrete case teases out the degree to which archetypical accountability practices and processes function in relation to algorithmic decision-making, and which new questions concerning algorithmic accountability emerge therein. The case is approached through the analysis of “scavenged” material. It was found that while these archetypical accountability processes and practices can be incredibly productive in dealing with algorithmic systems, they are simultaneously at risk. The current accountability configurations hinge predominantly on the ex ante sensitivity and responsiveness of the political fora. When these prove insufficient, mitigation in medias res or ex post is very difficult for other actants. In part, this is not a new phenomenon, but it is amplified in relation to algorithmic systems. Different fora ask different kinds of medium-specific questions of the actor, from different perspectives and with varying power relations. These algorithm-specific considerations relate to the decision-making around an algorithmic system, its functionality, and its deployment. Strengthening the attention of ex ante political accountability fora to these algorithm-specific considerations could help mitigate this.
Source journal: Data & Policy
CiteScore: 3.10 · Self-citation rate: 0.00% · Review time: 12 weeks