$1.00 per RT #BostonMarathon #PrayForBoston: Analyzing fake content on Twitter

Aditi Gupta, Hemank Lamba, P. Kumaraguru
DOI: 10.1109/ECRS.2013.6805772
Venue: 2013 APWG eCrime Researchers Summit
Published: 2013-10-20
Citations: 141

Abstract

Online social media has emerged as one of the prominent channels for disseminating information during real-world events. Malicious content posted online during such events can cause damage, chaos, and monetary losses in the real world. We analyzed one such medium, Twitter, for content generated during the Boston Marathon blasts of April 15, 2013. A large amount of fake content and many malicious profiles originated on the Twitter network during this event. The aim of this work is to characterize in depth the factors that influenced malicious content and profiles going viral. Our results show that 29% of the most viral content on Twitter during the Boston crisis consisted of rumors and fake content, 51% consisted of generic opinions and comments, and the rest was true information. We found that a large number of users with high social reputation and verified accounts were responsible for spreading the fake content. Next, we used a regression prediction model to verify that the overall impact of all users who propagate a piece of fake content at a given time can be used to estimate that content's future growth. Many malicious accounts created on Twitter during the Boston event were later suspended by Twitter. We identified over six thousand such user profiles and observed that the creation of such profiles surged considerably right after the blasts occurred. In the interaction network of these suspended profiles amongst themselves, we identified closed community structures and star formations.
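The prediction idea in the abstract, that the aggregate impact of all users who have propagated a piece of fake content by a given time can estimate that content's future growth, can be sketched as a toy regression. Everything below is a hypothetical illustration: the choice of total follower count as the "impact" feature, the synthetic numbers, and the log-linear model form are assumptions, not the paper's actual features or fitted model.

```python
# Toy sketch (synthetic data) of regressing future spread on the
# aggregate impact of the users who have shared a rumor so far.
import numpy as np

# Synthetic training data: total follower count of spreaders at time t
# ("impact") vs. number of shares observed afterwards ("growth").
impact = np.array([1e3, 5e3, 2e4, 8e4, 3e5, 1e6])
growth = np.array([12.0, 40.0, 150.0, 600.0, 2200.0, 7500.0])

# Fit a simple least-squares line on log-scaled values, since both
# quantities span several orders of magnitude.
slope, intercept = np.polyfit(np.log10(impact), np.log10(growth), 1)

def predict_growth(total_impact: float) -> float:
    """Estimate future shares from the current aggregate impact."""
    return 10 ** (slope * np.log10(total_impact) + intercept)

print(predict_growth(1e5))
```

On this synthetic data the fitted slope is positive, so larger aggregate impact at observation time maps to a larger estimate of future spread, which is the qualitative relationship the paper's model verifies.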
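The star formations and closed communities reported in the interaction network of suspended accounts can be illustrated with a minimal sketch. The account names, edges, and detection rules below are invented for illustration; the paper does not specify these exact definitions.

```python
# Illustrative sketch (synthetic data) of two interaction patterns among
# suspended accounts: a "star", where one hub interacts with many
# otherwise-isolated accounts, and a closed community, where a group of
# accounts interacts only with each other.
from collections import defaultdict

edges = [
    # star: hub0 interacts with five accounts that talk to nobody else
    ("hub0", "leaf1"), ("hub0", "leaf2"), ("hub0", "leaf3"),
    ("hub0", "leaf4"), ("hub0", "leaf5"),
    # closed community: three accounts that only mention each other
    ("acct_a", "acct_b"), ("acct_b", "acct_c"), ("acct_c", "acct_a"),
]

# Undirected adjacency sets built from the interaction edges.
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def is_star_hub(node, min_leaves=3):
    """Hub of a star: its neighbours connect to nothing but the hub."""
    leaves = adj[node]
    return len(leaves) >= min_leaves and all(adj[l] == {node} for l in leaves)

def is_closed_community(nodes):
    """Every member interacts only with other members of the group."""
    nodes = set(nodes)
    return all(adj[n] <= nodes - {n} for n in nodes)

print([n for n in adj if is_star_hub(n)])                    # → ['hub0']
print(is_closed_community({"acct_a", "acct_b", "acct_c"}))   # → True
```

A real analysis would run community detection and degree-distribution checks over the full suspended-accounts graph; this sketch only makes the two structural patterns concrete.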