{"title":"Exploring the influence of technology regulatory policy instruments on public acceptance of algorithm recommender systems","authors":"Yue Guo , Sirui Li , Lei Zhou , Yu Sun","doi":"10.1016/j.giq.2024.101940","DOIUrl":null,"url":null,"abstract":"<div><p>The application of algorithm recommender systems introduces several potential risks, including privacy infringement, discriminatory outcomes, and opacity. Governments worldwide have introduced regulatory policy instruments with varying degrees of coerciveness. However, few studies have examined the impact of regulatory policy instruments on public acceptance. This study investigates the nuanced ways in which regulatory policy instruments spanning a spectrum of coerciveness shape public perceptions of privacy risks and acceptance of algorithm recommender systems. Through a survey experiment involving a sample size of 2015 participants, we created three distinct categories of regulatory policy instruments to serve as experimental treatments and one control group with no intervention. Our empirical findings illustrate the substantial treatment effects stemming from regulatory policy instruments. Furthermore, these regulatory policy instruments, infused with varying degrees of coerciveness, assume a dynamic moderating role, initially mitigating and subsequently intensifying the adverse influence of perceived privacy risks on public acceptance. By synthesizing two streams of literature on technology acceptance and regulatory policy, our research underscores the fact that regulatory policy instruments have the potential to reshape the public's perceived risk associated with technology use, exerting a considerable influence on public technology acceptance. These findings have implications for governments seeking effective, fine-grained algorithmic governance based on individual behavior. Policymakers should consider public risk perceptions and technology acceptance when designing regulatory policies for enterprises.</p></div>","PeriodicalId":48258,"journal":{"name":"Government Information Quarterly","volume":"41 3","pages":"Article 101940"},"PeriodicalIF":7.8000,"publicationDate":"2024-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Government Information Quarterly","FirstCategoryId":"91","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0740624X24000327","RegionNum":1,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Citations: 0
Abstract
The application of algorithm recommender systems introduces several potential risks, including privacy infringement, discriminatory outcomes, and opacity. Governments worldwide have introduced regulatory policy instruments with varying degrees of coerciveness. However, few studies have examined the impact of regulatory policy instruments on public acceptance. This study investigates the nuanced ways in which regulatory policy instruments spanning a spectrum of coerciveness shape public perceptions of privacy risks and acceptance of algorithm recommender systems. In a survey experiment with 2015 participants, we designed three distinct categories of regulatory policy instruments as experimental treatments, alongside a control group with no intervention. Our empirical findings show substantial treatment effects of the regulatory policy instruments. Furthermore, these instruments, with their varying degrees of coerciveness, play a dynamic moderating role: they initially mitigate and subsequently intensify the adverse influence of perceived privacy risks on public acceptance. By synthesizing two streams of literature on technology acceptance and regulatory policy, our research shows that regulatory policy instruments can reshape the public's perceived risk associated with technology use, exerting a considerable influence on public technology acceptance. These findings have implications for governments seeking effective, fine-grained algorithmic governance based on individual behavior. Policymakers should consider public risk perceptions and technology acceptance when designing regulatory policies for enterprises.
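The abstract does not report the estimation strategy, but a design with randomized policy-instrument treatments and a moderating role for perceived privacy risk is commonly analyzed with an interaction (moderation) model. The following is a minimal, hypothetical sketch of such an analysis on simulated data; the variable names (acceptance, privacy_risk, treatment) and the data-generating assumptions are invented for illustration and are not the authors' actual analysis.

# Hypothetical sketch: estimating treatment effects and a
# treatment x perceived-privacy-risk interaction (moderation)
# for a survey experiment of the kind described in the abstract.
# All data below are simulated; variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2015  # sample size reported in the abstract

df = pd.DataFrame({
    # 0 = control, 1-3 = policy instruments of increasing coerciveness
    "treatment": rng.integers(0, 4, n),
    # perceived privacy risk on a simulated 1-7 scale
    "privacy_risk": rng.uniform(1, 7, n),
})
# Simulated acceptance outcome with a treatment-dependent risk slope
df["acceptance"] = (
    5
    + 0.3 * df["treatment"]
    - (0.5 - 0.1 * df["treatment"]) * df["privacy_risk"]
    + rng.normal(0, 1, n)
)

# OLS with treatment dummies, perceived risk, and their interaction;
# the interaction coefficients indicate how each instrument moderates
# the negative effect of perceived privacy risk on acceptance.
model = smf.ols("acceptance ~ C(treatment) * privacy_risk", data=df).fit()
print(model.summary())

In such a specification, the coefficients on the treatment dummies capture average treatment effects, while the interaction terms capture the moderation pattern the abstract describes (an attenuated or amplified risk-acceptance relationship depending on the instrument's coerciveness).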
Journal Introduction
Government Information Quarterly (GIQ) delves into the convergence of policy, information technology, government, and the public. It explores the impact of policies on government information flows, the role of technology in innovative government services, and the dynamic between citizens and governing bodies in the digital age. GIQ serves as a premier journal, disseminating high-quality research and insights that bridge the realms of policy, information technology, government, and public engagement.