Peer Assessment Rubric Analyzer: An NLP approach to analyzing rubric items for better peer-review

M. Parvez Rashid, E. Gehringer, Mitchell Young, Divyang Doshi, Qinjin Jia, Yunkai Xiao
{"title":"Peer Assessment Rubric Analyzer: An NLP approach to analyzing rubric items for better peer-review","authors":"M. Parvez Rashid, E. Gehringer, Mitchell Young, Divyang Doshi, Qinjin Jia, Yunkai Xiao","doi":"10.1109/ITHET50392.2021.9759679","DOIUrl":null,"url":null,"abstract":"Rubrics have long been used to provide a grading process that is fair and adherent to standards. Just as rubrics can help instructors assess a piece of work, they can also help students to do a more effective job of peer assessment. In a peer-review environment, reviewers provide formative feedback following the rubric criteria. High-quality feedback can greatly enhance the learning process. Rubric criteria need to be worded carefully to provide clear instruction and effective guidance. Heretofore, little research has been performed on how rubric text affects rubric feedback. This study focuses on analyzing rubric text to identify whether rubric criteria will induce peer-reviewers to write quality reviews. We have analyzed 408,104 formative feedback comments based on 3,164 rubric criteria using natural language processing techniques with advanced neural network methods. To our knowledge, this is the first attempt to analyze rubric text to improve review comments for the peer-review environment.","PeriodicalId":339339,"journal":{"name":"2021 19th International Conference on Information Technology Based Higher Education and Training (ITHET)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 19th International Conference on Information Technology Based Higher Education and Training (ITHET)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITHET50392.2021.9759679","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Rubrics have long been used to provide a grading process that is fair and adherent to standards. Just as rubrics can help instructors assess a piece of work, they can also help students to do a more effective job of peer assessment. In a peer-review environment, reviewers provide formative feedback following the rubric criteria. High-quality feedback can greatly enhance the learning process. Rubric criteria need to be worded carefully to provide clear instruction and effective guidance. Heretofore, little research has been performed on how rubric text affects rubric feedback. This study focuses on analyzing rubric text to identify whether rubric criteria will induce peer-reviewers to write quality reviews. We have analyzed 408,104 formative feedback comments based on 3,164 rubric criteria using natural language processing techniques with advanced neural network methods. To our knowledge, this is the first attempt to analyze rubric text to improve review comments for the peer-review environment.
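
The abstract does not describe the models in detail, so the following is a minimal, purely illustrative sketch of the kind of pipeline it implies: classifying rubric-criterion text by whether it tends to elicit substantive peer feedback. The example criteria, their labels, and the TF-IDF/logistic-regression model (standing in for the paper's "advanced neural network methods") are all assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: predict whether a rubric criterion tends to elicit
# substantive peer-review comments. The criteria and labels below are
# invented; the paper's actual dataset (408,104 comments, 3,164 criteria)
# and neural models are not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy rubric criteria with invented labels:
# 1 = tended to elicit substantive feedback, 0 = did not.
criteria = [
    "Explain how the design handles concurrent updates, citing code.",
    "Is the report good?",
    "Identify one strength and one weakness of the test suite.",
    "Rate the project.",
]
labels = [1, 0, 1, 0]

# Bag-of-ngram features plus a linear classifier as a simple stand-in
# for the neural text classifiers the abstract alludes to.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(criteria, labels)

# Score a new (hypothetical) criterion.
print(model.predict(["Describe a specific improvement the author could make."]))
```

The design choice this sketch illustrates is that the unit of analysis is the rubric criterion's wording itself, with review quality as the supervision signal, rather than the reviews alone.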