Algorithmic Discrimination in Public Service Provision: Understanding Citizens’ Attribution of Responsibility for Human vs Algorithmic Discriminatory Outcomes

Impact Factor: 6.3 · JCR Q1 (Political Science) · CAS Region 1 (Management Science)
Saar Alon-Barkat, Madalina Busuioc, Kayla Schwoerer, Kristina S Weißmüller
Journal of Public Administration Research and Theory · DOI: 10.1093/jopart/muaf024 · Published: 2025-09-09 · Citations: 0

Abstract

As public bodies increasingly adopt AI technologies in their work, there is simultaneously growing attention to the risk that reliance on the technology may introduce biases and produce discriminatory administrative outcomes, as demonstrated by multiple real-world cases. Our contribution addresses a core theoretical puzzle: with AI algorithms increasingly embedded across public services, we lack crucial knowledge about how citizens assign responsibility to public organizations for algorithmic failures and discrimination in public services compared to human discrimination. This speaks to key questions as to whether organizational responsibility attribution mechanisms and public demand for consequences fundamentally change in the context of algorithmic governance. Addressing this gap, we examine whether individual citizens are less likely to attribute responsibility for algorithmic vs. human discrimination in public service provision. Building on the psychology literature, we further theorize potential mechanisms that underlie these effects and shape citizens' responses. We investigate these research questions through a pre-registered survey experiment conducted in the Netherlands (N = 2,483). Our findings indicate that public organizations are not held to a lower responsibility standard for algorithmic compared to human discrimination. Technological delegation to AI does not allow public bodies to bypass responsibility for discriminatory outcomes. However, we find that citizens assign more responsibility for algorithmic discrimination when the algorithm is developed in-house rather than externally. This could lead to the emergence of accountability deficits pertaining to technological outsourcing.
Source journal metrics: CiteScore 8.50 · Self-citation rate 11.90% · Articles published: 46
期刊介绍: The Journal of Public Administration Research and Theory serves as a bridge between public administration or public management scholarship and public policy studies. The Journal aims to provide in-depth analysis of developments in the organizational, administrative, and policy sciences as they apply to government and governance. Each issue brings you critical perspectives and cogent analyses, serving as an outlet for the best theoretical and research work in the field. The Journal of Public Administration Research and Theory is the official journal of the Public Management Research Association.