Testing crowdsourcing as a means of recruitment for the comparative judgement of L2 argumentative essays

Peter Thwaites, Magali Paquot
Journal of Second Language Writing (IF 5.0, JCR Q1, Linguistics), Volume 68, Article 101207
Published: 28 April 2025
DOI: 10.1016/j.jslw.2025.101207

Abstract

Comparative judgement (CJ) is an assessment method in which a large number of pairwise comparisons between learner productions are used to generate scales ranking each item from strongest to weakest. Recent research has suggested that combining CJ with various approaches to judge recruitment, including community-driven and crowdsourcing methods, holds promise as a method of assessing L2 writing, especially for research purposes. However, the majority of studies to date have tested CJ only using relatively simple, easily evaluated sets of texts. There remains insufficient evidence of the method’s potential for assessing more complex texts, particularly when the comparisons are being conducted by crowdsourced assessors. This study seeks to address this problem by testing the reliability and validity of a crowdsourced form of CJ for the assessment of texts which are longer, more topically diverse, and more homogeneous in proficiency than those used in earlier studies. The results suggest that CJ can be conducted with crowdsourced judges to generate reliable assessments of L2 writing, and provide initial evidence of concurrent validity. However, there are drawbacks in terms of efficiency.
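The abstract describes CJ's core mechanism, many pairwise comparisons converted into a single scale, without spelling out the scaling step. In published CJ work this step is typically done by fitting a Bradley-Terry (or Rasch-type) model to the comparison outcomes. The Python sketch below is a minimal, hypothetical illustration of that idea using invented essay labels and judgement data; it is not the procedure or software used in this study.

# Minimal Bradley-Terry sketch: turning pairwise judgements ("X was judged
# better than Y") into a strength scale. Illustrative only, with invented
# data; not the scaling procedure used in the study reported above.
from collections import defaultdict

# Hypothetical crowdsourced judgements as (winner, loser) pairs.
judgements = [
    ("essay_A", "essay_B"), ("essay_B", "essay_C"), ("essay_C", "essay_A"),
    ("essay_A", "essay_B"), ("essay_A", "essay_C"), ("essay_B", "essay_C"),
]

items = sorted({essay for pair in judgements for essay in pair})
wins = defaultdict(int)         # comparisons each essay won
pair_counts = defaultdict(int)  # comparisons per unordered pair
for winner, loser in judgements:
    wins[winner] += 1
    pair_counts[frozenset((winner, loser))] += 1

# MM iteration for the Bradley-Terry model:
# p_i <- W_i / sum_{j != i} n_ij / (p_i + p_j), then renormalise.
strength = {i: 1.0 for i in items}
for _ in range(500):
    updated = {}
    for i in items:
        denom = sum(
            pair_counts.get(frozenset((i, j)), 0) / (strength[i] + strength[j])
            for j in items if j != i
        )
        updated[i] = wins[i] / denom if denom else strength[i]
    total = sum(updated.values())
    strength = {i: p * len(items) / total for i, p in updated.items()}

# Rank essays from strongest to weakest on the fitted scale.
for essay, p in sorted(strength.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{essay}: {p:.3f}")

The iterative update is a simple way to obtain maximum-likelihood strength estimates without external dependencies; operational CJ platforms additionally handle adaptive pair selection and typically report the reliability of the resulting scale with a scale separation reliability statistic.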
Source journal: Journal of Second Language Writing
CiteScore: 8.80
Self-citation rate: 13.10%
Articles per year: 50
Review time: 59 days
About the journal: The Journal of Second Language Writing is devoted to publishing theoretically grounded reports of research and discussions that represent a significant contribution to current understandings of central issues in second and foreign language writing and writing instruction. Some areas of interest are personal characteristics and attitudes of L2 writers, L2 writers' composing processes, features of L2 writers' texts, readers' responses to L2 writing, assessment/evaluation of L2 writing, contexts (cultural, social, political, institutional) for L2 writing, and any other topic clearly relevant to L2 writing theory, research, or instruction.