An Item Response Theory Approach to Enhance Peer Assessment Effectiveness in Massive Open Online Courses

IF 3.3 Q1 EDUCATION & EDUCATIONAL RESEARCH
M. Nakayama, F. Sciarrone, M. Temperini, Masaki Uto
{"title":"An Item Response Theory Approach to Enhance Peer Assessment Effectiveness in Massive Open Online Courses","authors":"M. Nakayama, F. Sciarrone, M. Temperini, Masaki Uto","doi":"10.4018/ijdet.313639","DOIUrl":null,"url":null,"abstract":"Massive open on-line courses (MOOCs) are effective and flexible resources to educate, train, and empower populations. Peer assessment (PA) provides a powerful pedagogical strategy to support educational activities and foster learners' success, also where a huge number of learners is involved. Item response theory (IRT) can model students' features, such as the skill to accomplish a task, and the capability to mark tasks. In this paper the authors investigate the applicability of IRT models to PA, in the learning environments of MOOCs. The main goal is to evaluate the relationships between some students' IRT parameters (ability, strictness) and some PA parameters (number of graders per task, and rating scale). The authors use a data-set simulating a large class (1,000 peers), built by a Gaussian distribution of the students' skill, to accomplish a task. The IRT analysis of the PA data allow to say that the best estimate for peers' ability is when 15 raters per task are used, with a [1,10] rating scale.","PeriodicalId":44463,"journal":{"name":"International Journal of Distance Education Technologies","volume":" ","pages":""},"PeriodicalIF":3.3000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Distance Education Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/ijdet.313639","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

Massive open online courses (MOOCs) are effective and flexible resources for educating, training, and empowering populations. Peer assessment (PA) is a powerful pedagogical strategy for supporting educational activities and fostering learners' success, even when a very large number of learners is involved. Item response theory (IRT) can model student characteristics such as the skill to accomplish a task and the capability to mark tasks. In this paper, the authors investigate the applicability of IRT models to PA in MOOC learning environments. The main goal is to evaluate the relationships between students' IRT parameters (ability, strictness) and PA parameters (number of graders per task and rating scale). The authors use a data set simulating a large class (1,000 peers), built from a Gaussian distribution of the students' skill at accomplishing a task. The IRT analysis of the PA data indicates that the best estimate of peers' ability is obtained with 15 raters per task and a [1,10] rating scale.
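The simulation setup described in the abstract can be sketched as follows. This is a minimal illustrative stand-in, not the authors' actual IRT model: abilities are drawn from a Gaussian as in the paper's data set, each task is graded by 15 raters on a [1,10] scale (the paper's best setting), and grades follow an assumed simple linear model of ability, rater strictness, and noise. The noise levels and score-to-scale mapping are hypothetical choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

N_PEERS = 1000      # class size used in the paper
N_RATERS = 15       # raters per task (the paper's best setting)
SCALE = (1, 10)     # the paper's [1,10] rating scale

# True abilities drawn from a Gaussian, as in the paper's simulated data set.
ability = rng.normal(0.0, 1.0, N_PEERS)
# Hypothetical per-rater strictness: a bias subtracted from the grades a rater gives.
strictness = rng.normal(0.0, 0.3, N_PEERS)

def simulate_ratings(ability, strictness, n_raters, scale, rng):
    """Each peer's task is graded by n_raters randomly chosen peers.
    Grades follow a simple linear model (ability - strictness + noise)
    mapped onto the discrete rating scale; an illustrative stand-in
    for the paper's IRT model, not a reproduction of it."""
    lo, hi = scale
    grades = np.empty((len(ability), n_raters))
    for i in range(len(ability)):
        raters = rng.choice(len(ability), size=n_raters, replace=False)
        raw = ability[i] - strictness[raters] + rng.normal(0.0, 0.5, n_raters)
        # Map z-like raw scores onto [lo, hi] and round to integer grades.
        grades[i] = np.clip(np.round((raw + 3) / 6 * (hi - lo) + lo), lo, hi)
    return grades

grades = simulate_ratings(ability, strictness, N_RATERS, SCALE, rng)
estimate = grades.mean(axis=1)          # naive ability estimate per peer
r = np.corrcoef(ability, estimate)[0, 1]
print(f"correlation(true ability, mean grade) = {r:.3f}")
```

With 15 raters the per-task noise averages out, so even this naive mean-grade estimator tracks true ability closely; the paper's contribution is estimating ability and strictness jointly with an IRT model rather than a simple mean.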
Source journal: CiteScore 9.10 · Self-citation rate 0.00% · Articles published: 14
Journal description: Discussions of computational methods, algorithms, implemented prototype systems, and applications of open and distance learning are the focuses of this publication. Practical experiences with and surveys of distance learning systems are also welcome. Distance education technologies published in IJDET are divided into three categories: communication technologies, intelligent technologies, and educational technologies.