Responsible AI innovation in the public sector: Lessons from and recommendations for facilitating Fundamental Rights and Algorithms Impact Assessments

I.M. Muis, J. Straatman, B.A. Kamphorst
{"title":"Responsible AI innovation in the public sector: Lessons from and recommendations for facilitating Fundamental Rights and Algorithms Impact Assessments","authors":"I.M. Muis,&nbsp;J. Straatman,&nbsp;B.A. Kamphorst","doi":"10.1016/j.jrt.2025.100118","DOIUrl":null,"url":null,"abstract":"<div><div>Since the initial development of the Fundamental Rights and Algorithms Impact Assessment (FRAIA) in 2021, there has been an increasing interest from public sector organizations to gain experience with performing a FRAIA in contexts of developing, procuring, and deploying AI systems. In this contribution, we share observations from fifteen FRAIA trajectories performed in the field within the Dutch public sector context. Based on our experiences facilitating these trajectories, we offer a set of recommendations directed at practitioners with the aim of helping organizations make the best use of FRAIA and similar impact assessment instruments. We conclude by calling for the development of an informal FRAIA community in which practical handholds and advice can be shared to promote responsible AI innovation by ensuring that the human decision making around AI and other algorithms is well informed and well documented with respect to the protection of fundamental rights.</div></div>","PeriodicalId":73937,"journal":{"name":"Journal of responsible technology","volume":"22 ","pages":"Article 100118"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of responsible technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666659625000149","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Since the initial development of the Fundamental Rights and Algorithms Impact Assessment (FRAIA) in 2021, there has been increasing interest from public sector organizations in gaining experience with performing a FRAIA in the contexts of developing, procuring, and deploying AI systems. In this contribution, we share observations from fifteen FRAIA trajectories performed in the field within the Dutch public sector. Based on our experience facilitating these trajectories, we offer a set of recommendations directed at practitioners, with the aim of helping organizations make the best use of FRAIA and similar impact assessment instruments. We conclude by calling for the development of an informal FRAIA community in which practical guidance and advice can be shared, promoting responsible AI innovation by ensuring that the human decision making around AI and other algorithms is well informed and well documented with respect to the protection of fundamental rights.
Source journal: Journal of responsible technology (Information Systems, Artificial Intelligence, Human-Computer Interaction)
CiteScore: 3.60
Self-citation rate: 0.00%
Review time: 168 days