{"title":"Information source and content: articulating two key concepts for information evaluation","authors":"Iulian Vamanu, Elizabeth Zak","doi":"10.1108/ils-09-2021-0084","DOIUrl":null,"url":null,"abstract":"\nPurpose\nLearning how to identify and avoid inaccurate information, especially disinformation, is essential for any informational consumer. Many information literacy tools specify criteria that can help users evaluate information more efficiently and effectively. However, the authors of these tools do not always agree on which criteria should be emphasized, what they mean or why they should be included in the tool. This study aims to clarify two such criteria (source credibility and soundness of content), which evolutionary cognitive psychology research emphasize. This paper uses them as a basis for building a question-based evaluation tool and draws implications for information literacy programs.\n\n\nDesign/methodology/approach\nThis paper draws on cross-disciplinary scholarship (in library and information science, evolutionary cognitive psychology and rhetoric studies) to explore 15 approaches to information evaluation which conceptualizes source credibility and content soundness, two markers of information accuracy. This paper clarifies these two concepts, builds two sets of questions meant to elicit empirical indicators of information accuracy and deploys them against a recent piece of journalism which embeds a conspiracy theory about the origins of the COVID-19 pandemic. This paper shows how the two standards can help us determine that the article is misleading. This paper draws implications for information literacy programs.\n\n\nFindings\nThe meanings of and relationships between source credibility and content soundness often diverge across the 15 approaches to information evaluation this paper analyzed. Conceptual analysis allowed the authors to articulate source credibility in terms of authority and trustworthiness, and content soundness in terms of plausibility and evidential support. These conceptualizations allow the authors to formulate two respective sets of appropriate questions, the answers to which are meant to function as empirical indicators for the two standards. Deploying this instrument provides us with the opportunity to understand why a certain article discussing COVID-19 is misleading.\n\n\nOriginality/value\nBy articulating source credibility and content soundness as the two key criteria for evaluating information, together with guiding questions meant to elicit empirical indicators for them, this paper streamlines the process through which information users can judge the likelihood that a piece of information they encounter is accurate.\n","PeriodicalId":44588,"journal":{"name":"Information and Learning Sciences","volume":"30 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2022-02-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information and Learning Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/ils-09-2021-0084","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Citations: 1
Abstract
Purpose
Learning how to identify and avoid inaccurate information, especially disinformation, is essential for any information consumer. Many information literacy tools specify criteria that can help users evaluate information more efficiently and effectively. However, the authors of these tools do not always agree on which criteria should be emphasized, what they mean or why they should be included in the tool. This study aims to clarify two such criteria (source credibility and soundness of content), which research in evolutionary cognitive psychology emphasizes. This paper uses them as a basis for building a question-based evaluation tool and draws implications for information literacy programs.
Design/methodology/approach
This paper draws on cross-disciplinary scholarship (in library and information science, evolutionary cognitive psychology and rhetoric studies) to explore 15 approaches to information evaluation that conceptualize source credibility and content soundness, two markers of information accuracy. This paper clarifies these two concepts, builds two sets of questions meant to elicit empirical indicators of information accuracy and deploys them against a recent piece of journalism that embeds a conspiracy theory about the origins of the COVID-19 pandemic. It then shows how the two standards help determine that the article is misleading and draws implications for information literacy programs.
Findings
The meanings of and relationships between source credibility and content soundness often diverge across the 15 approaches to information evaluation analyzed in this paper. Conceptual analysis allowed the authors to articulate source credibility in terms of authority and trustworthiness, and content soundness in terms of plausibility and evidential support. These conceptualizations allow the authors to formulate two corresponding sets of questions, the answers to which are meant to function as empirical indicators for the two standards. Deploying this instrument helps explain why a particular article discussing COVID-19 is misleading.
Originality/value
By articulating source credibility and content soundness as the two key criteria for evaluating information, together with guiding questions meant to elicit empirical indicators for them, this paper streamlines the process through which information users can judge the likelihood that a piece of information they encounter is accurate.
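As one possible way to make the framework concrete, the two criteria and the four dimensions named in the findings (authority, trustworthiness, plausibility, evidential support) could be modeled as a simple question checklist. This is an illustrative sketch only: the paper's actual guiding questions are not reproduced in this abstract, so the example questions below are hypothetical placeholders.

```python
# Illustrative sketch of a two-criteria, question-based evaluation checklist.
# The specific questions are hypothetical placeholders, not the authors' own.

from dataclasses import dataclass, field


@dataclass
class Criterion:
    """One evaluation criterion with guiding questions grouped by dimension."""
    name: str
    dimensions: dict[str, list[str]] = field(default_factory=dict)

    def checklist(self) -> list[str]:
        """Flatten the guiding questions into a single list of prompts."""
        return [q for questions in self.dimensions.values() for q in questions]


SOURCE_CREDIBILITY = Criterion(
    name="Source credibility",
    dimensions={
        "authority": [
            "Does the author have relevant expertise or credentials?",  # hypothetical
        ],
        "trustworthiness": [
            "Does the source have a record of accuracy and disclosed interests?",  # hypothetical
        ],
    },
)

CONTENT_SOUNDNESS = Criterion(
    name="Content soundness",
    dimensions={
        "plausibility": [
            "Is the claim consistent with well-established knowledge?",  # hypothetical
        ],
        "evidential_support": [
            "Is the claim backed by verifiable, cited evidence?",  # hypothetical
        ],
    },
)

if __name__ == "__main__":
    for criterion in (SOURCE_CREDIBILITY, CONTENT_SOUNDNESS):
        print(criterion.name)
        for question in criterion.checklist():
            print("  -", question)
```

Under this sketch, a user's answers to the flattened checklist would serve as the empirical indicators the abstract describes, with each criterion judged from its own set of questions.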
Journal description:
Information and Learning Sciences advances interdisciplinary research that explores scholarly intersections shared between two key fields: information science and the learning sciences / education sciences. The journal provides a publication venue for work that strengthens our scholarly understanding of human inquiry and learning phenomena, especially as they relate to the design and use of information and e-learning system innovations.