Read-Agree-Predict: A Crowdsourced Approach to Discovering Relevant Primary Sources for Historians

Nai-Ching Wang, D. Hicks, P. Quigley, Kurt Luther
{"title":"Read-Agree-Predict: A Crowdsourced Approach to Discovering Relevant Primary Sources for Historians","authors":"Nai-Ching Wang, D. Hicks, P. Quigley, Kurt Luther","doi":"10.15346/hc.v6i1.8","DOIUrl":null,"url":null,"abstract":"Historians spend significant time evaluating the relevance of primary sources that they encounter in digitized archives and through web searches. One reason this task is time-consuming is that historians’ research interests are often highly abstract and specialized. These topics are unlikely to be manually indexed and are difficult to identify with automated text analysis techniques. In this article, we investigate the potential of a new crowdsourcing model in which the historian delegates to a novice crowd the task of evaluating the relevance of primary sources with respect to her unique research interests. The model employs a novel crowd workflow, Read-AgreePredict (RAP), that allows novice crowd workers to perform as well as expert historians. As a useful byproduct, RAP also reveals and prioritizes crowd confusions as targeted learning opportunities. We demonstrate the value of our model with two experiments with paid crowd workers (n=170), with the future goal of extending our work to classroom students and public history interventions. We also discuss broader implications for historical research and education.","PeriodicalId":92785,"journal":{"name":"Human computation (Fairfax, Va.)","volume":"421 1","pages":"147-175"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human computation (Fairfax, Va.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15346/hc.v6i1.8","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Historians spend significant time evaluating the relevance of primary sources that they encounter in digitized archives and through web searches. One reason this task is time-consuming is that historians' research interests are often highly abstract and specialized. These topics are unlikely to be manually indexed and are difficult to identify with automated text analysis techniques. In this article, we investigate the potential of a new crowdsourcing model in which the historian delegates to a novice crowd the task of evaluating the relevance of primary sources with respect to her unique research interests. The model employs a novel crowd workflow, Read-Agree-Predict (RAP), that allows novice crowd workers to perform as well as expert historians. As a useful byproduct, RAP also reveals and prioritizes crowd confusions as targeted learning opportunities. We demonstrate the value of our model with two experiments with paid crowd workers (n=170), with the future goal of extending our work to classroom students and public history interventions. We also discuss broader implications for historical research and education.
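To make the delegation idea concrete, the sketch below shows one simple way crowd relevance judgments for a single primary source could be combined. This is only a minimal illustration under assumed details; the abstract does not describe the internal stages of the RAP workflow, and the function name `aggregate_relevance`, the label vocabulary, and the majority-vote threshold are hypothetical.

```python
# Hypothetical sketch: combining novice crowd workers' relevance votes on one
# primary source with respect to a historian's research topic. This is NOT the
# paper's RAP workflow (its stages are not detailed in the abstract); it only
# illustrates the general delegation-and-aggregation idea.
from collections import Counter
from typing import List


def aggregate_relevance(votes: List[str], threshold: float = 0.5) -> str:
    """Return 'relevant' if the share of 'relevant' votes exceeds the threshold."""
    if not votes:
        return "unknown"
    counts = Counter(votes)
    return "relevant" if counts["relevant"] / len(votes) > threshold else "not relevant"


# Example: five novice workers judge one document against the historian's topic.
print(aggregate_relevance(["relevant", "relevant", "not relevant", "relevant", "not relevant"]))
```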