Peer Assessment Rubric Analyzer: An NLP approach to analyzing rubric items for better peer-review
M. Parvez Rashid, E. Gehringer, Mitchell Young, Divyang Doshi, Qinjin Jia, Yunkai Xiao
2021 19th International Conference on Information Technology Based Higher Education and Training (ITHET), November 4, 2021
DOI: 10.1109/ITHET50392.2021.9759679
Citations: 2
Abstract
Rubrics have long been used to make the grading process fair and consistent with standards. Just as rubrics can help instructors assess a piece of work, they can also help students perform peer assessment more effectively. In a peer-review environment, reviewers provide formative feedback following the rubric criteria, and high-quality feedback can greatly enhance the learning process. Rubric criteria therefore need to be worded carefully to provide clear instruction and effective guidance. To date, little research has examined how the wording of rubric text affects the feedback it elicits. This study analyzes rubric text to identify whether a given rubric criterion will induce peer reviewers to write high-quality reviews. We analyzed 408,104 formative feedback comments written in response to 3,164 rubric criteria, using natural language processing techniques and neural network methods. To our knowledge, this is the first attempt to analyze rubric text in order to improve review comments in a peer-review environment.
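The abstract does not specify which model the authors used, so the following is only a minimal sketch of the general approach it describes: scoring rubric-criterion text with a pretrained neural sequence classifier. The checkpoint, labels, and the rubric items below are stand-ins, not the paper's actual pipeline; in practice the classifier would be fine-tuned on rubric items labeled by the quality of the feedback they elicited.

```python
# Hypothetical sketch: classify rubric-item text with a neural NLP model.
# The checkpoint below is a generic sentiment classifier used only as a
# placeholder; the authors' actual model and labels are not given.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Example rubric criteria (invented for illustration): one specific and
# directive, one vague.
rubric_items = [
    "Does the design document explain how the new classes interact "
    "with the existing ones? Point out any missing interactions.",
    "Rate the writing.",
]

for item in rubric_items:
    result = classifier(item)[0]  # dict with 'label' and 'score' keys
    print(f"{item[:50]!r} -> {result['label']} ({result['score']:.2f})")
```

Under this framing, each rubric criterion is a short text whose wording can be mapped to a predicted feedback-quality class, which is consistent with the abstract's goal of flagging criteria unlikely to elicit quality reviews.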