Pseudo-public political speech: Democratic implications of the Cambridge Analytica scandal

Inf. Polity · Published: 2018-12-10 · DOI: 10.3233/IP-180009
J. Heawood
Citations: 22

Abstract

On 29 July 2018, the House of Commons Select Committee on Digital, Culture, Media and Sport published a report on ‘fake news’ (DCMS, 2018). Oddly, but perhaps appropriately, the report wasn’t actually about fake news. The Select Committee explained that, although they had begun by looking at fake news, they had been diverted by a series of stories about a company called Cambridge Analytica that were published in the Observer earlier this year. In those articles, Carole Cadwalladr, the Observer journalist, had revealed that Cambridge Analytica, or companies linked to Cambridge Analytica, had used the personal data of about 200,000 Facebook users to build up detailed psychological profiles of up to 87 million Facebook users. Whilst the initial 200,000 users had voluntarily completed a personality test, they had not necessarily known how their answers would be used, and the 87 million users who were profiled had most certainly not given their informed consent for this (Cadwalladr, 2018). Cambridge Analytica used this massive database to help political campaigners in the United Kingdom, the United States and other countries to target Facebook users with highly specific messages. This ‘microtargeting’ has been defined as ‘a type of personalised communication that involves collecting information about people, and using that information to show them targeted political advertisements’ (Borgesius et al., 2018: 81). Cambridge Analytica used a profiling tool called OCEAN to categorise Facebook users on the basis of their ‘Openness’, ‘Conscientiousness’, ‘Extraversion’, ‘Agreeableness’ and ‘Neuroticism’. They then helped their clients to target users with the most effective messages. The Select Committee observed that Cambridge Analytica ‘might play on the fears of someone who could be frightened into believing that they needed the right to have a gun to protect their home from intruders’.
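The mechanism the abstract describes, scoring users on the five OCEAN traits and then matching each user to the message most likely to move them, can be sketched in a few lines. This is a purely illustrative reconstruction, not Cambridge Analytica's actual system: the trait names come from the article, but the message mapping, scores, and function names are hypothetical.

```python
# Hypothetical sketch of trait-based microtargeting. Only the five
# OCEAN trait names are taken from the source; everything else
# (mapping, scores, function names) is invented for illustration.

# Big Five (OCEAN) traits used for profiling.
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Hypothetical mapping from a user's dominant trait to an ad framing,
# echoing the Select Committee's example of fear-based gun-rights appeals
# aimed at users scoring high on neuroticism.
MESSAGE_VARIANTS = {
    "openness": "novelty-framed message",
    "conscientiousness": "duty-framed message",
    "extraversion": "social-proof message",
    "agreeableness": "community-framed message",
    "neuroticism": "fear/safety-framed message",
}

def dominant_trait(profile: dict) -> str:
    """Return the trait with the highest score in a 0-1 OCEAN profile."""
    return max(TRAITS, key=lambda t: profile.get(t, 0.0))

def select_message(profile: dict) -> str:
    """Choose the ad variant matching the user's dominant trait."""
    return MESSAGE_VARIANTS[dominant_trait(profile)]

# Example: a user scoring highest on neuroticism would be shown the
# fear/safety-framed variant.
user = {"openness": 0.3, "conscientiousness": 0.5, "extraversion": 0.2,
        "agreeableness": 0.4, "neuroticism": 0.9}
print(select_message(user))  # fear/safety-framed message
```

The sketch makes the democratic worry concrete: the targeting logic is trivial once the profiles exist, so the contested step is the non-consensual construction of the 87 million profiles, not the ad-selection code itself.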