Adding propedeuticy dependencies to the OpenAnswer Bayesian model of peer-assessment

M. De Marsico, A. Sterbini, M. Temperini
DOI: 10.1109/ITHET.2014.7155692
Published in: 2014 Information Technology Based Higher Education and Training (ITHET), September 2014
Citations: 1

Abstract

Peer-assessment can be used to evaluate the knowledge level achieved by learners while, at the same time, exposing them to a significant learning activity. Here we present an approach to the semi-automatic grading of school work. It is based on peer-assessment of answers to open-ended questions (“open answers”), supported by the teacher's grading of a portion of the answers. The methodology represents the student models and the answers through Bayesian networks, and supports grading on a six-value scale (the widely used “A” to “F” scale). The experiments test the possibility of modeling the fact that the knowledge required to perform some work is preparatory (propaedeutic) to that required for a subsequent one. They were conducted with the web-based system OpenAnswer and were meant to collect datasets with which to evaluate which algorithms, settings, and termination conditions are best applied to obtain a reliable set of grades from the learners' peer-assessment process and the teacher's grading work (with the latter limited to a small percentage of the answers to be graded).
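The kind of propaedeutic link the abstract describes can be sketched as a tiny Bayesian-network fragment. The following is an illustrative sketch only, not the paper's actual OpenAnswer model: all variable names, network structure, and probability values below are invented for illustration. A binary node K1 (knowledge of a prerequisite topic) influences K2 (knowledge of a dependent topic), and an observed peer grade on the “A”–“F” scale acts as a noisy observation of K2; exact inference by enumeration then yields a posterior over both knowledge nodes.

```python
# Illustrative two-node "propaedeutic" fragment (NOT the paper's model).
# K1 = prerequisite topic known?  K2 = dependent topic known?
# G2 = observed grade for the answer on K2, on the A..F scale.
# All probabilities are invented placeholder values.

GRADES = ["A", "B", "C", "D", "E", "F"]

P_K1 = {True: 0.5, False: 0.5}           # prior on the prerequisite knowledge
P_K2_GIVEN_K1 = {True: 0.8, False: 0.2}  # P(K2 known | K1): the propaedeutic link

# Likelihood of each grade given whether the dependent topic is known.
P_G_GIVEN_K2 = {
    True:  {"A": 0.35, "B": 0.30, "C": 0.20, "D": 0.10, "E": 0.04, "F": 0.01},
    False: {"A": 0.01, "B": 0.04, "C": 0.10, "D": 0.20, "E": 0.30, "F": 0.35},
}

def posterior(grade):
    """Exact inference by enumeration: returns P(K1, K2 | G2 = grade)."""
    joint = {}
    for k1 in (True, False):
        for k2 in (True, False):
            p_k2 = P_K2_GIVEN_K1[k1] if k2 else 1.0 - P_K2_GIVEN_K1[k1]
            joint[(k1, k2)] = P_K1[k1] * p_k2 * P_G_GIVEN_K2[k2][grade]
    z = sum(joint.values())                       # normalising constant P(G2 = grade)
    return {kk: p / z for kk, p in joint.items()}

# Observing a high grade on the dependent topic raises the belief that
# the prerequisite topic is also known (evidence flows back along the link).
post = posterior("A")
p_k1_known = post[(True, True)] + post[(True, False)]
```

In a full model along the lines the abstract sketches, each student and each answer would carry such nodes, peer grades would enter as further evidence, and the teacher's grades on a subset of answers would clamp some nodes to observed values before propagating.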