Exploring Rater Accuracy Using Unfolding Models Combined with Topic Models: Incorporating Supervised Latent Dirichlet Allocation

Impact Factor: 0.6 · JCR Quartile: Q3 · Category: Social Sciences, Interdisciplinary
Jordan M. Wheeler, G. Engelhard, Jue Wang
{"title":"Exploring Rater Accuracy Using Unfolding Models Combined with Topic Models: Incorporating Supervised Latent Dirichlet Allocation","authors":"Jordan M. Wheeler, G. Engelhard, Jue Wang","doi":"10.1080/15366367.2021.1915094","DOIUrl":null,"url":null,"abstract":"ABSTRACT Objectively scoring constructed-response items on educational assessments has long been a challenge due to the use of human raters. Even well-trained raters using a rubric can inaccurately assess essays. Unfolding models measure rater’s scoring accuracy by capturing the discrepancy between criterion and operational ratings by placing essays on an unfolding continuum with an ideal-point location. Essay unfolding locations indicate how difficult it is for raters to score an essay accurately. This study aims to explore a substantive interpretation of the unfolding scale based on a supervised Latent Dirichlet Allocation (sLDA) model. We investigate the relationship between latent topics extracted using sLDA and unfolding locations with a sample of essays (n = 100) obtained from an integrated writing assessment. Results show that (a) three latent topics moderately explain (r 2 = 0.561) essay locations defined by the unfolding scale and (b) failing to use and/or cite the source articles led to essays that are difficult-to-score accurately.","PeriodicalId":46596,"journal":{"name":"Measurement-Interdisciplinary Research and Perspectives","volume":"12 1","pages":"34 - 46"},"PeriodicalIF":0.6000,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Measurement-Interdisciplinary Research and Perspectives","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/15366367.2021.1915094","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"SOCIAL SCIENCES, INTERDISCIPLINARY","Score":null,"Total":0}
引用次数: 1

Abstract

Objectively scoring constructed-response items on educational assessments has long been a challenge due to the use of human raters. Even well-trained raters using a rubric can assess essays inaccurately. Unfolding models measure raters' scoring accuracy by capturing the discrepancy between criterion and operational ratings, placing each essay on an unfolding continuum with an ideal-point location. An essay's unfolding location indicates how difficult it is for raters to score that essay accurately. This study aims to explore a substantive interpretation of the unfolding scale based on a supervised Latent Dirichlet Allocation (sLDA) model. We investigate the relationship between latent topics extracted using sLDA and unfolding locations with a sample of essays (n = 100) obtained from an integrated writing assessment. Results show that (a) three latent topics moderately explain (R² = 0.561) essay locations defined by the unfolding scale and (b) failing to use and/or cite the source articles led to essays that are difficult to score accurately.
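The abstract does not reproduce the measurement model itself. As context, rater-accuracy unfolding analyses are often built on the hyperbolic cosine model (Andrich & Luo, 1993); the following is a sketch of that common parameterization, offered as an assumption about the general model family rather than the paper's exact specification:

\Pr(x_{ni} = 1 \mid \theta_n, \delta_i, \gamma) = \frac{\exp(\gamma)}{\exp(\gamma) + 2\cosh(\theta_n - \delta_i)}

Here x_{ni} = 1 indicates that rater n scored essay i accurately (operational rating agreeing with the criterion rating), \theta_n is rater n's accuracy location, \delta_i is essay i's ideal-point location on the unfolding continuum, and \gamma is a unit parameter. Because \cosh attains its minimum at zero, the probability of an accurate rating peaks when \theta_n = \delta_i and falls off symmetrically as the two locations diverge, which is what makes essays at extreme \delta_i values difficult to score accurately.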
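As a rough, self-contained illustration of the second stage of the analysis, the sketch below extracts latent topics from essay texts and regresses unfolding locations on the per-essay topic proportions. It is not the authors' code: scikit-learn ships only unsupervised LDA, so LatentDirichletAllocation stands in for the paper's sLDA (which estimates topics jointly with the response variable), and the function name topic_location_r2 and its inputs are hypothetical placeholders.

# Sketch: relate latent topics to essay unfolding locations.
# Substitutes scikit-learn's unsupervised LDA for the paper's sLDA;
# `essays` and `locations` are hypothetical inputs supplied by the caller.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LinearRegression

def topic_location_r2(essays, locations, n_topics=3, seed=0):
    """Return the R^2 from regressing unfolding locations on topic shares.

    essays    : list of raw essay texts (n = 100 in the study)
    locations : per-essay locations from a previously fit unfolding model
    n_topics  : number of latent topics (the abstract reports three)
    """
    # Document-term matrix of word counts, the input LDA-family models expect.
    counts = CountVectorizer(stop_words="english").fit_transform(essays)

    # Per-essay topic proportions (each row is a topic distribution).
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    theta = lda.fit_transform(counts)

    # Linear regression of locations on topic proportions; the paper
    # reports R^2 = 0.561 for its sLDA-based three-topic solution.
    reg = LinearRegression().fit(theta, locations)
    return reg.score(theta, locations)

Because unsupervised topics are not steered by the response variable, a value near the paper's R² = 0.561 should not be expected from this substitute; a faithful replication would require a genuine sLDA implementation.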
Source journal: Measurement-Interdisciplinary Research and Perspectives (Social Sciences, Interdisciplinary)
CiteScore: 1.80 · Self-citation rate: 0.00% · Annual publications: 23