Kaitlin Peach, Joe Ripberger, Kuhika Gupta, Andrew Fox, H. Jenkins‐Smith, & Carol Silva (2024, February 14). Seeing lies and laying blame: Partisanship and U.S. public perceptions about disinformation. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-135
Abstract: Using data from a nationally representative survey of 2,036 U.S. adults, we analyze partisan perceptions of the risk disinformation poses to the U.S. government and society, as well as the actors viewed as responsible for and harmed by disinformation. Our findings indicate relatively high concern about disinformation across a variety of societal issues, with broad bipartisan agreement that disinformation poses significant risks and causes harm to several groups. However, agreement ends there: Republicans and Democrats fundamentally disagree on who is responsible. We discuss the implications of this disagreement for understanding disinformation as a policy problem and for designing policy solutions.
{"title":"Measuring what matters: Investigating what new types of assessments reveal about students’ online source evaluations","authors":"Joel Breakstone, Sarah McGrew, Mark Smith","doi":"10.37016/mr-2020-133","DOIUrl":"https://doi.org/10.37016/mr-2020-133","url":null,"abstract":"A growing number of educational interventions have shown that students can learn the strategies fact checkers use to efficiently evaluate online information. Measuring the effectiveness of these interventions has required new approaches to assessment because extant measures reveal too little about the processes students use to evaluate live internet sources. In this paper, we analyze two types of assessments developed to meet the need for new measures. We describe what these assessments reveal about student thinking and how they provide practitioners, policymakers, and researchers options for measuring participants’ evaluative strategies.","PeriodicalId":93289,"journal":{"name":"Harvard Kennedy School misinformation review","volume":"12 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139782752","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Measuring what matters: Investigating what new types of assessments reveal about students’ online source evaluations","authors":"Joel Breakstone, Sarah McGrew, Mark Smith","doi":"10.37016/mr-2020-133","DOIUrl":"https://doi.org/10.37016/mr-2020-133","url":null,"abstract":"A growing number of educational interventions have shown that students can learn the strategies fact checkers use to efficiently evaluate online information. Measuring the effectiveness of these interventions has required new approaches to assessment because extant measures reveal too little about the processes students use to evaluate live internet sources. In this paper, we analyze two types of assessments developed to meet the need for new measures. We describe what these assessments reveal about student thinking and how they provide practitioners, policymakers, and researchers options for measuring participants’ evaluative strategies.","PeriodicalId":93289,"journal":{"name":"Harvard Kennedy School misinformation review","volume":"83 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139842677","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
László Horváth, Daniel Stevens, Susan Banducci, Raluca Popp, & Travis Coan (2024, February 5). Correcting campaign misinformation: Experimental evidence from a two-wave panel study. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-132
Abstract: In this study, we used a two-wave panel and a real-world intervention during the 2017 UK general election to investigate whether fact-checking can reduce belief in an incorrect campaign claim, the effects of the correction’s source, how long those source effects last, and how predispositions, including political orientations and prior exposure, condition them. We find correction effects in the short term only, but across different political divisions and various levels of prior exposure. We discuss the significance of independent fact-checking sources and the UK partisan press in facilitating these effects.
P. Ronzani, Folco Panizza, Tiffany Morisseau, Simone Mattavelli, & Carlo Martini (2024, January 25). How different incentives reduce scientific misinformation online. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-131
Abstract: Several social media platforms employ, or are considering, user recruitment as a defense against misinformation. Yet it is unclear how to encourage users to make accurate evaluations. Our study shows that presenting the performance of previous participants increases discernment of science-related news. Making participants aware that their evaluations would be used by future participants had no effect on accuracy. Lastly, monetary rewards had the largest effect on accuracy. Our study supports the idea that a person’s motivation is an essential component of their propensity to be vigilant online and that it is possible to devise strategies to strengthen this motivation.
Gillian Murphy, Constance de Saint Laurent, Megan Reynolds, Omar Aftab, Karen Hegarty, Yuning Sun, & C. Greene (2023, November 15). What do we study when we study misinformation? A scoping review of experimental research (2016-2022). Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-130
Abstract: We reviewed 555 papers published from 2016 to 2022 that presented misinformation to participants. We identified several trends in the literature: an increasing frequency of misinformation studies over time, a wide variety of topics covered, and a significant focus on COVID-19 misinformation since 2020. We also identified several important shortcomings, including overrepresentation of samples from the United States and Europe and excessive emphasis on the short-term consequences of brief, text-based misinformation. Most studies examined belief in misinformation as the primary outcome. While many researchers identified behavioural consequences of misinformation exposure as a pressing concern, we observed a lack of research directly investigating behaviour change.
S. Littrell, Casey A. Klofstad, A. Diekman, John Funchion, M. Murthi, K. Premaratne, Michelle I. Seelig, Daniel Verdear, Stefan Wuchty, & J. Uscinski (2023, August 25). Who knowingly shares false political information online? Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-121
Abstract: Some people share misinformation accidentally, but others do so knowingly. To fully understand the spread of misinformation online, it is important to analyze those who purposely share it. Using a 2022 U.S. survey, we found that 14 percent of respondents reported knowingly sharing misinformation, and that these respondents were more likely to also report support for political violence, a desire to run for office, and warm feelings toward extremists. These respondents were also more likely to have elevated levels of a psychological need for chaos, dark tetrad traits, and paranoia. Our findings illuminate one vector through which misinformation is spread.
Hansika Kapoor, S. Rezaei, Swanaya Gurjar, A. Tagat, Denny George, Yash Budhwar, & Arathy Puthillam (2023, August 7). Does incentivization promote sharing “true” content online? Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-120
Abstract: In an online experiment in India, incentives for sharing factual posts increased sharing compared to no incentivization. However, the type of incentive (monetary or social) did not influence sharing behavior in a custom social media simulation. Curbing misinformation may not require substantial monetary resources; in fact, social media platforms can devise ways to socially incentivize their users to be responsible netizens who share true information. The results point to the need for further analysis, through real-world experiments, of how effective social incentivization systems can be put in place.