{"title":"搜索引擎对女性和移民的偏见性自动建议会导致招聘歧视:实验调查","authors":"Cong Lin , Wang Liao , Na Ta","doi":"10.1016/j.chb.2024.108408","DOIUrl":null,"url":null,"abstract":"<div><p>This article addresses the effects of biased search engine autosuggestions on hiring discrimination against females and immigrants. In two pre-registered experiments (<em>N</em><sub>1</sub> = 266, <em>N</em><sub>2</sub> = 263), we exposed the participants to biased autosuggestions against these two groups in certain occupations (female lapidaries in Study 1 and immigrant rideshare drivers in Study 2) and measured the hiring preference. We found the biased autosuggestions affected the hiring preference, contingent on stereotypical beliefs of the respective groups: When the group was perceived less warm (e.g., females less warm than males), the biased autosuggestions increased users’ hiring discrimination against the group. In contrast, when the group was perceived warmer (e.g., immigrants warmer than non-immigrant citizens), the biased autosuggestions triggered a reactance response, reducing their hiring discrimination.</p></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"161 ","pages":"Article 108408"},"PeriodicalIF":9.0000,"publicationDate":"2024-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Biased search engine autosuggestions against females and immigrants can lead to hiring discrimination: An experimental investigation\",\"authors\":\"Cong Lin , Wang Liao , Na Ta\",\"doi\":\"10.1016/j.chb.2024.108408\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>This article addresses the effects of biased search engine autosuggestions on hiring discrimination against females and immigrants. In two pre-registered experiments (<em>N</em><sub>1</sub> = 266, <em>N</em><sub>2</sub> = 263), we exposed the participants to biased autosuggestions against these two groups in certain occupations (female lapidaries in Study 1 and immigrant rideshare drivers in Study 2) and measured the hiring preference. We found the biased autosuggestions affected the hiring preference, contingent on stereotypical beliefs of the respective groups: When the group was perceived less warm (e.g., females less warm than males), the biased autosuggestions increased users’ hiring discrimination against the group. 
In contrast, when the group was perceived warmer (e.g., immigrants warmer than non-immigrant citizens), the biased autosuggestions triggered a reactance response, reducing their hiring discrimination.</p></div>\",\"PeriodicalId\":48471,\"journal\":{\"name\":\"Computers in Human Behavior\",\"volume\":\"161 \",\"pages\":\"Article 108408\"},\"PeriodicalIF\":9.0000,\"publicationDate\":\"2024-08-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers in Human Behavior\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0747563224002760\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563224002760","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Biased search engine autosuggestions against females and immigrants can lead to hiring discrimination: An experimental investigation
This article examines the effects of biased search engine autosuggestions on hiring discrimination against females and immigrants. In two pre-registered experiments (N1 = 266, N2 = 263), we exposed participants to biased autosuggestions targeting these two groups in specific occupations (female lapidaries in Study 1 and immigrant rideshare drivers in Study 2) and measured their hiring preferences. We found that the biased autosuggestions affected hiring preferences, contingent on stereotypical beliefs about the respective groups: when a group was perceived as less warm (e.g., females as less warm than males), the biased autosuggestions increased users' hiring discrimination against that group. In contrast, when a group was perceived as warmer (e.g., immigrants as warmer than non-immigrant citizens), the biased autosuggestions triggered a reactance response, reducing users' hiring discrimination against that group.
About the journal:
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.