Lay User Involvement in Developing Human-Centric Responsible AI Systems: When and How?

Beatrice Vincenzi, Simone Stumpf, Alex S. Taylor, Yuri Nakao
{"title":"用户参与开发以人为本的负责任人工智能系统:何时以及如何?","authors":"Beatrice Vincenzi, Simone Stumpf, Alex S. Taylor, Yuri Nakao","doi":"10.1145/3652592","DOIUrl":null,"url":null,"abstract":"Artificial Intelligence (AI) is increasingly used in mainstream applications to make decisions that affect a large number of people. While research has focused on involving machine learning and domain experts during the development of responsible AI systems, the input of lay users has too often been ignored. By exploring the involvement of lay users, our work seeks to advance human-centric responsible AI development processes. To reflect on lay users’ views, we conducted an online survey of 1121 people in the United Kingdom. We found that respondents had concerns about fairness and transparency of AI systems which requires more education around AI to underpin lay user involvement. They saw a need for having their views reflected at all stages of the AI development lifecycle. Lay users mainly charged internal stakeholders to oversee the development process but supported by an ethics committee and input from an external regulatory body. We also probed for possible techniques for involving lay users more directly. Our work has implications for creating processes that ensure the development of responsible AI systems that take lay user perspectives into account.","PeriodicalId":329595,"journal":{"name":"ACM Journal on Responsible Computing","volume":"20 11","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Lay User Involvement in Developing Human-Centric Responsible AI Systems: When and How?\",\"authors\":\"Beatrice Vincenzi, Simone Stumpf, Alex S. Taylor, Yuri Nakao\",\"doi\":\"10.1145/3652592\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Artificial Intelligence (AI) is increasingly used in mainstream applications to make decisions that affect a large number of people. While research has focused on involving machine learning and domain experts during the development of responsible AI systems, the input of lay users has too often been ignored. By exploring the involvement of lay users, our work seeks to advance human-centric responsible AI development processes. To reflect on lay users’ views, we conducted an online survey of 1121 people in the United Kingdom. We found that respondents had concerns about fairness and transparency of AI systems which requires more education around AI to underpin lay user involvement. They saw a need for having their views reflected at all stages of the AI development lifecycle. Lay users mainly charged internal stakeholders to oversee the development process but supported by an ethics committee and input from an external regulatory body. We also probed for possible techniques for involving lay users more directly. 
Our work has implications for creating processes that ensure the development of responsible AI systems that take lay user perspectives into account.\",\"PeriodicalId\":329595,\"journal\":{\"name\":\"ACM Journal on Responsible Computing\",\"volume\":\"20 11\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-03-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Journal on Responsible Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3652592\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Journal on Responsible Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3652592","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citation count: 0

Abstract

Artificial Intelligence (AI) is increasingly used in mainstream applications to make decisions that affect a large number of people. While research has focused on involving machine learning and domain experts during the development of responsible AI systems, the input of lay users has too often been ignored. By exploring the involvement of lay users, our work seeks to advance human-centric responsible AI development processes. To reflect on lay users' views, we conducted an online survey of 1121 people in the United Kingdom. We found that respondents had concerns about the fairness and transparency of AI systems, which calls for more education around AI to underpin lay user involvement. They saw a need for having their views reflected at all stages of the AI development lifecycle. Lay users mainly charged internal stakeholders with overseeing the development process, supported by an ethics committee and input from an external regulatory body. We also probed possible techniques for involving lay users more directly. Our work has implications for creating processes that ensure the development of responsible AI systems that take lay user perspectives into account.