Do Users Act Equitably? Understanding User Bias Through a Large In-person Study

Yang Liu, Heather Moses, Mark Sternefeld, Samuel A. Malachowsky, Daniel E. Krutz
DOI: 10.1109/ICSE-SEIS58686.2023.00014
Published in: 2023 IEEE/ACM 45th International Conference on Software Engineering: Software Engineering in Society (ICSE-SEIS), May 2023
Citations: 1

Abstract

Inequitable software is a common problem. Bias may be introduced by developers or even by software users. As a society, it is crucial that we understand and identify the causes and implications of software bias, from both users and the software itself. To address the problem of inequitable software, it is essential that we inform and motivate the next generation of software developers about bias and its adverse impacts. However, research shows that there is a lack of easily adoptable, ethics-focused educational material to support this effort.

To address this problem, we created an easily adoptable, self-contained experiential activity designed to foster student interest in software ethics, with a specific emphasis on AI/ML bias. The activity has participants select fictitious teammates based solely on their appearance. Participants then experience bias, directed either at themselves or at a teammate, from the activity's fictitious AI. We then used the resulting lab in a study involving 173 real-world users (ages 18 to 51+) to better understand user bias.

The primary findings of our study are: I) participants from minority ethnic groups feel more strongly that they are impacted by inequitable software/AI; II) participants with a higher interest in AI/ML place a higher priority on unbiased software; III) users do not act equitably, as avatars with 'dark' skin color are less likely to be selected; and IV) participants from different demographic groups exhibit similar behavioral bias.

The experiential lab activity can be run with only a browser and an internet connection, and is publicly available on our project website: https://all.rit.edu.