Negative consequences of information gatekeeping through algorithmic technologies: An Annual Review of Information Science and Technology (ARIST) paper
Authors: Devendra Potnis, Iman Tahamtan, Luke McDonald
DOI: 10.1002/asi.24955
Journal: Journal of the Association for Information Science and Technology, 76(1), 262-288
Published: 2024-10-03
URL: https://onlinelibrary.wiley.com/doi/10.1002/asi.24955
Abstract
Rarely does any study investigate how information gatekeeping through algorithm-enabled solutions and services, hereafter referred to as algorithmic technologies (AT), creates negative consequences for users. To fill this gap, this state-of-the-art review analyzes 229 relevant articles from diverse academic disciplines. We employed thematic analysis to identify, analyze, classify, and reveal the chain reactions among the negative consequences. We found that the gatekeeping of information (text, audio, video, and graphics) through AT such as artificial intelligence (e.g., chatbots, large language models, machine learning, robots), decision support systems (used by banks, grocery stores, police, etc.), hashtags, online gaming platforms, search technologies (e.g., voice assistants, ChatGPT), and Web 3.0 (e.g., Internet of Things, non-fungible tokens) creates or reinforces cognitive vulnerability, economic divide and financial vulnerability, information divide, physical vulnerability, psychological vulnerability, and social divide, both virtually and in the offline world. Theoretical implications include the hierarchical depiction of the chain reactions among the primary, secondary, and tertiary divides and vulnerabilities. To mitigate these negative consequences, we call for concerted efforts using top-down strategies for governments, organizations, and technology experts to attain greater transparency, accountability, ethical behavior, and moral practices, and bottom-up strategies for users to be more alert, discerning, critical, and proactive.
About the Journal:
The Journal of the Association for Information Science and Technology (JASIST) is a leading international forum for peer-reviewed research in information science. For more than half a century, JASIST has provided intellectual leadership by publishing original research that focuses on the production, discovery, recording, storage, representation, retrieval, presentation, manipulation, dissemination, use, and evaluation of information and on the tools and techniques associated with these processes.
The Journal welcomes rigorous work of an empirical, experimental, ethnographic, conceptual, historical, socio-technical, policy-analytic, or critical-theoretical nature. JASIST also commissions in-depth review articles (“Advances in Information Science”) and reviews of print and other media.