Cong Lin, Yuxin Gao, Na Ta, Kaiyu Li, Hongyao Fu
Journal: Telematics and Informatics, Volume 85, Article 102068
DOI: 10.1016/j.tele.2023.102068
Published: 2023-11-01 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0736585323001326
Trapped in the search box: An examination of algorithmic bias in search engine autocomplete predictions
This paper examines the autocomplete algorithmic bias of leading search engines with respect to three sensitive attributes: gender, race, and sexual orientation. By simulating search query prefixes and calling search engine APIs, 106,896 autocomplete predictions were collected, and their semantic toxicity scores, used as measures of negative algorithmic bias, were computed with machine learning models. Results indicate that search engine autocomplete algorithmic bias is broadly consistent with long-standing societal discrimination. Historically disadvantaged groups, such as women, Black people, and homosexual people, suffer higher levels of negative algorithmic bias. Moreover, the degree of algorithmic bias varies across topic categories. Implications for search engine mediatization and for the mechanisms and consequences of autocomplete algorithmic bias are discussed.
Journal introduction:
Telematics and Informatics is an interdisciplinary journal that publishes cutting-edge theoretical and methodological research exploring the social, economic, geographic, political, and cultural impacts of digital technologies. It covers various application areas, such as smart cities, sensors, information fusion, digital society, IoT, cyber-physical technologies, privacy, knowledge management, distributed work, emergency response, mobile communications, health informatics, social media's psychosocial effects, ICT for sustainable development, blockchain, e-commerce, and e-government.