{"title":"贝叶斯有序同伴分级","authors":"Karthik Raman, T. Joachims","doi":"10.1145/2724660.2724678","DOIUrl":null,"url":null,"abstract":"Massive Online Open Courses have become an accessible and affordable choice for education. This has led to new technical challenges for instructors such as student evaluation at scale. Recent work has found ordinal peer grading}, where individual grader orderings are aggregated into an overall ordering of assignments, to be a viable alternate to traditional instructor/staff evaluation [23]. Existing techniques, which extend rank-aggregation methods, produce a single ordering as output. While these rankings have been found to be an accurate reflection of assignment quality on average, they do not communicate any of the uncertainty inherent in the assessment process. In particular, they do not to provide instructors with an estimate of the uncertainty of each assignment's position in the ranking. In this work, we tackle this problem by applying Bayesian techniques to the ordinal peer grading problem, using MCMC-based sampling techniques in conjunction with the Mallows model. Experiments are performed on real-world peer grading datasets, which demonstrate that the proposed method provides accurate uncertainty information via the estimated posterior distributions.","PeriodicalId":20664,"journal":{"name":"Proceedings of the Second (2015) ACM Conference on Learning @ Scale","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2015-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"34","resultStr":"{\"title\":\"Bayesian Ordinal Peer Grading\",\"authors\":\"Karthik Raman, T. Joachims\",\"doi\":\"10.1145/2724660.2724678\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Massive Online Open Courses have become an accessible and affordable choice for education. This has led to new technical challenges for instructors such as student evaluation at scale. Recent work has found ordinal peer grading}, where individual grader orderings are aggregated into an overall ordering of assignments, to be a viable alternate to traditional instructor/staff evaluation [23]. Existing techniques, which extend rank-aggregation methods, produce a single ordering as output. While these rankings have been found to be an accurate reflection of assignment quality on average, they do not communicate any of the uncertainty inherent in the assessment process. In particular, they do not to provide instructors with an estimate of the uncertainty of each assignment's position in the ranking. In this work, we tackle this problem by applying Bayesian techniques to the ordinal peer grading problem, using MCMC-based sampling techniques in conjunction with the Mallows model. 
Experiments are performed on real-world peer grading datasets, which demonstrate that the proposed method provides accurate uncertainty information via the estimated posterior distributions.\",\"PeriodicalId\":20664,\"journal\":{\"name\":\"Proceedings of the Second (2015) ACM Conference on Learning @ Scale\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-03-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"34\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Second (2015) ACM Conference on Learning @ Scale\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2724660.2724678\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Second (2015) ACM Conference on Learning @ Scale","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2724660.2724678","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Massive Open Online Courses have become an accessible and affordable choice for education. This has led to new technical challenges for instructors, such as student evaluation at scale. Recent work has found ordinal peer grading, where individual grader orderings are aggregated into an overall ordering of assignments, to be a viable alternative to traditional instructor/staff evaluation [23]. Existing techniques, which extend rank-aggregation methods, produce a single ordering as output. While these rankings have been found to be an accurate reflection of assignment quality on average, they do not communicate any of the uncertainty inherent in the assessment process. In particular, they do not provide instructors with an estimate of the uncertainty of each assignment's position in the ranking. In this work, we tackle this problem by applying Bayesian techniques to the ordinal peer grading problem, using MCMC-based sampling techniques in conjunction with the Mallows model. Experiments are performed on real-world peer grading datasets, which demonstrate that the proposed method provides accurate uncertainty information via the estimated posterior distributions.
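To illustrate the general idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of Metropolis-Hastings sampling from a posterior over orderings under a Mallows model, where each grader reports an ordering of a subset of assignments. Function names such as `kendall_tau` and `mcmc_posterior`, the adjacent-swap proposal, and the fixed dispersion `theta` are illustrative assumptions, not details taken from the paper.

```python
# Sketch: MCMC over rankings under a Mallows likelihood for ordinal peer grading.
import math
import random

def kendall_tau(ranking, grader_order):
    # Pairwise disagreements between the full ranking and one grader's
    # ordering of a subset of assignments (smaller = closer agreement).
    pos = {item: i for i, item in enumerate(ranking)}
    d = 0
    for i in range(len(grader_order)):
        for j in range(i + 1, len(grader_order)):
            if pos[grader_order[i]] > pos[grader_order[j]]:
                d += 1
    return d

def log_likelihood(ranking, grader_orders, theta):
    # Mallows model: P(grader order | ranking) ∝ exp(-theta * distance).
    # The normalizer depends only on theta and the subset size, so it
    # cancels in the Metropolis acceptance ratio below.
    return -theta * sum(kendall_tau(ranking, g) for g in grader_orders)

def mcmc_posterior(grader_orders, items, theta=1.0, n_samples=5000, burn_in=1000):
    # Uniform prior over orderings; symmetric adjacent-swap proposals.
    ranking = list(items)
    random.shuffle(ranking)
    cur_ll = log_likelihood(ranking, grader_orders, theta)
    samples = []
    for step in range(n_samples + burn_in):
        i = random.randrange(len(ranking) - 1)
        proposal = ranking[:]
        proposal[i], proposal[i + 1] = proposal[i + 1], proposal[i]
        prop_ll = log_likelihood(proposal, grader_orders, theta)
        if math.log(random.random()) < prop_ll - cur_ll:
            ranking, cur_ll = proposal, prop_ll
        if step >= burn_in:
            samples.append(tuple(ranking))
    return samples  # empirical posterior over full orderings

# Usage: per-assignment uncertainty as the spread of its sampled rank position.
graders = [["a", "b", "c"], ["b", "a", "d"], ["a", "c", "d"]]
samples = mcmc_posterior(graders, items=["a", "b", "c", "d"])
for item in ["a", "b", "c", "d"]:
    positions = [s.index(item) + 1 for s in samples]
    print(item, "mean rank:", round(sum(positions) / len(positions), 2))
```

The posterior samples give each assignment a distribution over rank positions rather than a single point in one ordering, which is the kind of uncertainty estimate the abstract says a single aggregated ranking cannot convey.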