Not our kind of crowd! How partisan bias distorts perceptions of political bots on Twitter (now X)

IF 3.2 · CAS Zone 2 (Psychology) · JCR Q1, PSYCHOLOGY, SOCIAL
Adrian Lüders, Stefan Reiss, Alejandro Dinkelberg, Pádraig MacCarron, Michael Quayle
{"title":"Not our kind of crowd! How partisan bias distorts perceptions of political bots on Twitter (now X)","authors":"Adrian Lüders, Stefan Reiss, Alejandro Dinkelberg, Pádraig MacCarron, Michael Quayle","doi":"10.1111/bjso.12794","DOIUrl":null,"url":null,"abstract":"Social bots, employed to manipulate public opinion, pose a novel threat to digital societies. Existing bot research has emphasized technological aspects while neglecting psychological factors shaping human–bot interactions. This research addresses this gap within the context of the US‐American electorate. Two datasets provide evidence that partisanship distorts (a) online users' representation of bots, (b) their ability to identify them, and (c) their intentions to interact with them. Study 1 explores global bot perceptions on through survey data from <jats:italic>N</jats:italic> = 452 Twitter (now X) users. Results suggest that users tend to attribute bot‐related dangers to political adversaries, rather than recognizing bots as a shared threat to political discourse. Study 2 (<jats:italic>N</jats:italic> = 619) evaluates the consequences of such misrepresentations for the quality of online interactions. In an online experiment, participants were asked to differentiate between human and bot profiles. Results indicate that partisan leanings explained systematic judgement errors. The same data suggest that participants aim to avoid interacting with bots. However, biased judgements may undermine this motivation in praxis. In sum, the presented findings underscore the importance of interdisciplinary strategies that consider technological and human factors to address the threats posed by bots in a rapidly evolving digital landscape.","PeriodicalId":48304,"journal":{"name":"British Journal of Social Psychology","volume":"7 1","pages":""},"PeriodicalIF":3.2000,"publicationDate":"2024-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"British Journal of Social Psychology","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1111/bjso.12794","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, SOCIAL","Score":null,"Total":0}
引用次数: 0

Abstract

Social bots, employed to manipulate public opinion, pose a novel threat to digital societies. Existing bot research has emphasized technological aspects while neglecting the psychological factors shaping human–bot interactions. This research addresses this gap within the context of the US-American electorate. Two datasets provide evidence that partisanship distorts (a) online users' representation of bots, (b) their ability to identify them, and (c) their intentions to interact with them. Study 1 explores global bot perceptions through survey data from N = 452 Twitter (now X) users. Results suggest that users tend to attribute bot-related dangers to political adversaries rather than recognizing bots as a shared threat to political discourse. Study 2 (N = 619) evaluates the consequences of such misrepresentations for the quality of online interactions. In an online experiment, participants were asked to differentiate between human and bot profiles. Results indicate that partisan leanings explained systematic judgement errors. The same data suggest that participants aim to avoid interacting with bots; however, biased judgements may undermine this motivation in practice. In sum, the presented findings underscore the importance of interdisciplinary strategies that consider both technological and human factors to address the threats posed by bots in a rapidly evolving digital landscape.
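For readers wondering how accuracy and bias in a human-versus-bot judgement task like Study 2 could be quantified, the sketch below applies standard signal detection measures (sensitivity d' and response criterion c). This is a hypothetical illustration, not the authors' reported analysis; the function name and the example counts are assumptions for demonstration only.

    # Hypothetical illustration (not taken from the paper): quantifying
    # bot-detection performance with signal detection theory. Each profile
    # is labelled "bot" or "human"; a hit is a bot correctly flagged,
    # a false alarm is a human profile mistakenly flagged as a bot.
    from statistics import NormalDist

    def sdt_measures(hits, misses, false_alarms, correct_rejections):
        """Return sensitivity (d') and response bias (criterion c)."""
        # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z = NormalDist().inv_cdf
        d_prime = z(hit_rate) - z(fa_rate)             # ability to tell bots from humans
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # > 0 means reluctant to answer "bot"
        return d_prime, criterion

    # Example: a participant who judged 20 bot and 20 human profiles.
    print(sdt_measures(hits=14, misses=6, false_alarms=9, correct_rejections=11))

Under this framing, a partisan bias would show up as a shift in the criterion (or in false-alarm rates) depending on whether the judged profile appears to belong to the participant's in-group or out-group, rather than as a uniform drop in sensitivity.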
Source journal metrics
CiteScore: 9.50
Self-citation rate: 7.40%
Articles per year: 85
Journal description: The British Journal of Social Psychology publishes work from scholars based in all parts of the world, and manuscripts that present data on a wide range of populations inside and outside the UK. It publishes original papers in all areas of social psychology including:
• social cognition
• attitudes
• group processes
• social influence
• intergroup relations
• self and identity
• nonverbal communication
• social psychological aspects of personality, affect and emotion
• language and discourse
Submissions addressing these topics from a variety of approaches and methods, both quantitative and qualitative, are welcomed. We publish papers of the following kinds:
• empirical papers that address theoretical issues;
• theoretical papers, including analyses of existing social psychological theories and presentations of theoretical innovations, extensions, or integrations;
• review papers that provide an evaluation of work within a given area of social psychology and that present proposals for further research in that area;
• methodological papers concerning issues that are particularly relevant to a wide range of social psychologists;
• an invited agenda article as the first article in the first part of every volume.
The editorial team aims to handle papers as efficiently as possible. In 2016, papers were triaged within less than a week, and the average turnaround time from receipt of the manuscript to first decision sent back to the authors was 47 days.