The “digital personalization effect” (DPE): A quantification of the possible extent to which personalizing content can increase the impact of online manipulations

IF 9.0 | Region 1 (Psychology) | Q1 PSYCHOLOGY, EXPERIMENTAL
Robert Epstein, Amanda Newland, Li Yu Tang
{"title":"“数字个性化效应”(DPE):对个性化内容增加在线操纵影响的可能程度的量化","authors":"Robert Epstein,&nbsp;Amanda Newland,&nbsp;Li Yu Tang","doi":"10.1016/j.chb.2025.108578","DOIUrl":null,"url":null,"abstract":"<div><div>In recent published reports on the “search engine manipulation effect,” the “targeted messaging effect,” and the “answer bot effect,” exposure to biased content produced significant shifts in the opinions and voting preferences of undecided voters. In the present study, these effects were replicated on simulations of the Google search engine, X (f.k.a., Twitter), and Alexa, and biased content was also personalized. Participants (all from the US) were first asked to rank order news and other sources according to how much they preferred them. Then they were randomly assigned either to a group in which content about the 2019 Australian election for Prime Minister would be received from highly preferred or least preferred sources. Participants were also randomly assigned either to a group in which the content was highly biased to favor Candidate-A or to favor Candidate-B. In all three experiments, the voting preferences of participants who saw biased content – that is, content that favored Candidate A or Candidate B – apparently coming from least preferred sources shifted by relatively small amounts (17.1% in Experiment 1, 21.8% in Experiment 2, and 39.3% in Experiment 3); whereas, the voting preferences of participants who saw that same biased content apparently coming from highly preferred sources shifted by significantly and substantially larger amounts (67.7% in Experiment 1, 71.9% in Experiment 2, and 65.9% in Experiment 3). All shifts occurred in the direction of the candidate favored by the bias. We conclude that personalization of biased content can greatly increase the impact of such content.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"166 ","pages":"Article 108578"},"PeriodicalIF":9.0000,"publicationDate":"2025-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The “digital personalization effect” (DPE): A quantification of the possible extent to which personalizing content can increase the impact of online manipulations\",\"authors\":\"Robert Epstein,&nbsp;Amanda Newland,&nbsp;Li Yu Tang\",\"doi\":\"10.1016/j.chb.2025.108578\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>In recent published reports on the “search engine manipulation effect,” the “targeted messaging effect,” and the “answer bot effect,” exposure to biased content produced significant shifts in the opinions and voting preferences of undecided voters. In the present study, these effects were replicated on simulations of the Google search engine, X (f.k.a., Twitter), and Alexa, and biased content was also personalized. Participants (all from the US) were first asked to rank order news and other sources according to how much they preferred them. Then they were randomly assigned either to a group in which content about the 2019 Australian election for Prime Minister would be received from highly preferred or least preferred sources. Participants were also randomly assigned either to a group in which the content was highly biased to favor Candidate-A or to favor Candidate-B. 
In all three experiments, the voting preferences of participants who saw biased content – that is, content that favored Candidate A or Candidate B – apparently coming from least preferred sources shifted by relatively small amounts (17.1% in Experiment 1, 21.8% in Experiment 2, and 39.3% in Experiment 3); whereas, the voting preferences of participants who saw that same biased content apparently coming from highly preferred sources shifted by significantly and substantially larger amounts (67.7% in Experiment 1, 71.9% in Experiment 2, and 65.9% in Experiment 3). All shifts occurred in the direction of the candidate favored by the bias. We conclude that personalization of biased content can greatly increase the impact of such content.</div></div>\",\"PeriodicalId\":48471,\"journal\":{\"name\":\"Computers in Human Behavior\",\"volume\":\"166 \",\"pages\":\"Article 108578\"},\"PeriodicalIF\":9.0000,\"publicationDate\":\"2025-01-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers in Human Behavior\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0747563225000251\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563225000251","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0

Abstract

In recently published reports on the “search engine manipulation effect,” the “targeted messaging effect,” and the “answer bot effect,” exposure to biased content produced significant shifts in the opinions and voting preferences of undecided voters. In the present study, these effects were replicated on simulations of the Google search engine, X (f.k.a. Twitter), and Alexa, and the biased content was also personalized. Participants (all from the US) were first asked to rank-order news and other sources according to how much they preferred them. They were then randomly assigned to a group that would receive content about the 2019 Australian election for Prime Minister from either highly preferred or least preferred sources. Participants were also randomly assigned to a group in which the content was highly biased to favor either Candidate A or Candidate B. In all three experiments, the voting preferences of participants who saw biased content (that is, content favoring Candidate A or Candidate B) apparently coming from least preferred sources shifted by relatively small amounts (17.1% in Experiment 1, 21.8% in Experiment 2, and 39.3% in Experiment 3), whereas the voting preferences of participants who saw the same biased content apparently coming from highly preferred sources shifted by significantly and substantially larger amounts (67.7% in Experiment 1, 71.9% in Experiment 2, and 65.9% in Experiment 3). All shifts occurred in the direction of the candidate favored by the bias. We conclude that personalization of biased content can greatly increase the impact of such content.
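As an aside for readers curious how a preference shift of this kind can be quantified, below is a minimal sketch in Python. It assumes the shift is measured as the relative change in the share of participants who say they would vote for the bias-favored candidate between a pre-exposure and a post-exposure survey; the data, function name, and metric are illustrative assumptions, not the authors' published procedure.

```python
# Illustrative sketch only. Assumes the "shift" is the relative change in the
# share of participants favoring the bias-favored candidate from a pre-exposure
# to a post-exposure survey; this is not necessarily the paper's exact metric.

def preference_shift(pre_votes: list[str], post_votes: list[str], favored: str) -> float:
    """Percentage shift toward `favored` from pre- to post-exposure responses."""
    pre_share = pre_votes.count(favored) / len(pre_votes)
    post_share = post_votes.count(favored) / len(post_votes)
    if pre_share == 0:
        raise ValueError("No pre-exposure support for the favored candidate.")
    return 100.0 * (post_share - pre_share) / pre_share

# Hypothetical data: of 100 undecided participants, 40 lean toward Candidate A
# before exposure and 67 lean toward A after seeing biased, personalized content.
pre = ["A"] * 40 + ["B"] * 60
post = ["A"] * 67 + ["B"] * 33
print(f"Shift toward Candidate A: {preference_shift(pre, post, 'A'):.1f}%")  # 67.5%
```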
Source journal: Computers in Human Behavior
CiteScore: 19.10
Self-citation rate: 4.00%
Articles published: 381
Review time: 40 days
Journal introduction: Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.