Biased search engine autosuggestions against females and immigrants can lead to hiring discrimination: An experimental investigation

IF 9.0 · CAS Tier 1 (Psychology) · JCR Q1 PSYCHOLOGY, EXPERIMENTAL
Citations: 0

Abstract


This article addresses the effects of biased search engine autosuggestions on hiring discrimination against females and immigrants. In two pre-registered experiments (N1 = 266, N2 = 263), we exposed the participants to biased autosuggestions against these two groups in certain occupations (female lapidaries in Study 1 and immigrant rideshare drivers in Study 2) and measured the hiring preference. We found the biased autosuggestions affected the hiring preference, contingent on stereotypical beliefs of the respective groups: When the group was perceived less warm (e.g., females less warm than males), the biased autosuggestions increased users’ hiring discrimination against the group. In contrast, when the group was perceived warmer (e.g., immigrants warmer than non-immigrant citizens), the biased autosuggestions triggered a reactance response, reducing their hiring discrimination.

Source journal: Computers in Human Behavior
CiteScore: 19.10
Self-citation rate: 4.00%
Articles per year: 381
Review time: 40 days
About the journal: Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It publishes original theoretical work, research reports, literature reviews, and software and book reviews. The journal covers both the use of computers in psychology, psychiatry, and related fields and the psychological impact of computer use on individuals, groups, and society. Articles address topics such as professional practice, training, research, human development, learning, cognition, personality, and social interaction. Its focus is on human interaction with computers, treating the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find the journal valuable, even with limited knowledge of computers.