Anomaly detection in the course evaluation process: a learning analytics–based approach

IF 3.5 · Q1 · Education & Educational Research
Anagha Vaidya, Saurabh Sharma
{"title":"Anomaly detection in the course evaluation process: a learning analytics–based approach","authors":"Anagha Vaidya, Saurabh Sharma","doi":"10.1108/itse-09-2022-0124","DOIUrl":null,"url":null,"abstract":"\nPurpose\nCourse evaluations are formative and are used to evaluate learnings of the students for a course. Anomalies in the evaluation process can lead to a faulty educational outcome. Learning analytics and educational data mining provide a set of techniques that can be conveniently applied to extensive data collected as part of the evaluation process to ensure remedial actions. This study aims to conduct an experimental research to detect anomalies in the evaluation methods.\n\n\nDesign/methodology/approach\nExperimental research is conducted with scientific approach and design. The researchers categorized anomaly into three categories, namely, an anomaly in criteria assessment, subject anomaly and anomaly in subject marks allocation. The different anomaly detection algorithms are used to educate data through the software R, and the results are summarized in the tables.\n\n\nFindings\nThe data points occurring in all algorithms are finally detected as an anomaly. The anomaly identifies the data points that deviate from the data set’s normal behavior. The subject which is consistently identified as anomalous by the different techniques is marked as an anomaly in evaluation. After identification, one can drill down to more details into the title of anomalies in the evaluation criteria.\n\n\nOriginality/value\nThis paper proposes an analytical model for the course evaluation process and demonstrates the use of actionable analytics to detect anomalies in the evaluation process.\n","PeriodicalId":44954,"journal":{"name":"Interactive Technology and Smart Education","volume":null,"pages":null},"PeriodicalIF":3.5000,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Interactive Technology and Smart Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/itse-09-2022-0124","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 1

Abstract

Purpose
Course evaluations are formative and are used to assess students' learning in a course. Anomalies in the evaluation process can lead to faulty educational outcomes. Learning analytics and educational data mining provide a set of techniques that can be conveniently applied to the extensive data collected as part of the evaluation process so that remedial actions can be taken. This study aims to conduct experimental research to detect anomalies in the evaluation methods.

Design/methodology/approach
Experimental research is conducted with a scientific approach and design. The researchers categorized anomalies into three categories: anomalies in criteria assessment, subject anomalies and anomalies in subject marks allocation. Different anomaly detection algorithms are applied to the data in the software R, and the results are summarized in tables.

Findings
The data points flagged by all of the algorithms are finally detected as anomalies. An anomaly is a data point that deviates from the data set's normal behavior. A subject that is consistently identified as anomalous by the different techniques is marked as an anomaly in the evaluation. After identification, one can drill down into the details of the anomalies in the evaluation criteria.

Originality/value
This paper proposes an analytical model for the course evaluation process and demonstrates the use of actionable analytics to detect anomalies in the evaluation process.
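
The abstract notes that several anomaly detection algorithms were run in the software R and that only the data points flagged by every algorithm were treated as anomalies. The base R sketch below is purely illustrative: the subject codes, marks and detector thresholds are hypothetical and are not taken from the study; it simply shows one way such a consensus rule over multiple simple detectors can be expressed.

    # Hypothetical per-subject marks for one evaluation criterion
    marks <- c(62, 64, 61, 65, 63, 60, 95, 64, 62, 35)
    names(marks) <- paste0("SUB", seq_along(marks))

    # Detector 1: z-score rule (|z| > 2)
    z_flag <- abs(scale(marks)[, 1]) > 2

    # Detector 2: Tukey boxplot rule (beyond 1.5 * IQR from the hinges)
    box_flag <- marks %in% boxplot.stats(marks)$out

    # Detector 3: robust rule based on the median absolute deviation (> 3 MAD units)
    mad_flag <- abs(marks - median(marks)) / mad(marks) > 3

    # Consensus: a subject is reported only if every detector flags it
    consensus <- z_flag & box_flag & mad_flag
    names(marks)[consensus]   # with this toy data, only "SUB7" (95 marks) survives all three rules

With this toy data the low score of 35 is caught by the boxplot and MAD rules but not by the z-score rule, so it is not reported; that mirrors the conservative "flagged by all algorithms" criterion described in the Findings.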
Source journal
Interactive Technology and Smart Education
CiteScore: 12.00
Self-citation rate: 2.30%
Articles published: 30
Journal description: Interactive Technology and Smart Education (ITSE) is a multi-disciplinary, peer-reviewed journal, which provides a distinct forum to specially promote innovation and participative research approaches. The following terms are defined, as used in the context of this journal:
- Interactive Technology refers to all forms of digital technology, as described above, emphasizing innovation and human-/user-centred approaches.
- Smart Education: "SMART" is used as an acronym that refers to interactive technology that offers a more flexible and tailored approach to meet diverse individual requirements by being "Sensitive, Manageable, Adaptable, Responsive and Timely" to educators' pedagogical strategies and learners' educational and social needs.
- Articles are invited that explore innovative use of educational technologies that advance interactive technology in general and its applications in education in particular.
The journal aims to bridge gaps in the field by promoting design research, action research, and continuous evaluation as an integral part of the development cycle of usable solutions/systems.