Becoming transparent and feeling helpless: Expressions of vulnerability and inequality in online critiques of algorithmic surveillance in China

IF 1.2 · CAS Zone 4 (Sociology) · JCR Q2 (Area Studies)
Asian Journal of Social Science · Pub Date: 2025-12-01 · Epub Date: 2025-10-21 · DOI: 10.1016/j.ajss.2025.100214
Haili Li , Genia Kostka
Citations: 0

Abstract

During the COVID-19 pandemic, numerous countries, including China, deployed digital surveillance technologies as part of broader social governance strategies. While these technologies offered certain benefits, their widespread application also posed risks, including algorithmic bias and privacy infringement. This study examines critical discussions (or critiques) on Chinese social media concerning various problems induced by algorithmic surveillance technologies such as the Health Code and Travel Code during the pandemic. Employing computational and qualitative textual analysis, our findings highlight recurring accounts of algorithmic and technical failures that users encountered when interacting with surveillance technologies. These disruptions exposed individuals to heightened algorithmic vulnerability and intensified existing inequalities, particularly through unequal treatment and negative emotional experiences. Our research further implies that the critical discussions often framed the Chinese government’s massive deployment of algorithmic surveillance technologies as exacerbating pre-existing issues, such as the digital divide and social bias, especially for vulnerable groups like older people. Meanwhile, our analysis of online critiques highlights growing concerns and skepticism among some users toward both algorithmic technologies and state governance.
Source journal metrics: CiteScore 1.20 · Self-citation rate 0.00% · Articles published: 53
Journal description: The Asian Journal of Social Science is a principal outlet for scholarly articles on Asian societies published by the Department of Sociology, National University of Singapore. AJSS provides a unique forum for theoretical debates and empirical analyses that move away from narrow disciplinary focus. It is committed to comparative research and articles that speak to cases beyond the traditional concerns of area and single-country studies. AJSS strongly encourages transdisciplinary analysis of contemporary and historical social change in Asia by offering a meeting space for international scholars across the social sciences, including anthropology, cultural studies, economics, geography, history, political science, psychology, and sociology. AJSS also welcomes humanities-oriented articles that speak to pertinent social issues. AJSS publishes internationally peer-reviewed research articles, special thematic issues and shorter symposiums. AJSS also publishes book reviews and review essays, research notes on Asian societies, and short essays of special interest to students of the region.