{"title":"Anomaly detection in the course evaluation process: a learning analytics–based approach","authors":"Anagha Vaidya, Saurabh Sharma","doi":"10.1108/itse-09-2022-0124","DOIUrl":null,"url":null,"abstract":"\nPurpose\nCourse evaluations are formative and are used to evaluate learnings of the students for a course. Anomalies in the evaluation process can lead to a faulty educational outcome. Learning analytics and educational data mining provide a set of techniques that can be conveniently applied to extensive data collected as part of the evaluation process to ensure remedial actions. This study aims to conduct an experimental research to detect anomalies in the evaluation methods.\n\n\nDesign/methodology/approach\nExperimental research is conducted with scientific approach and design. The researchers categorized anomaly into three categories, namely, an anomaly in criteria assessment, subject anomaly and anomaly in subject marks allocation. The different anomaly detection algorithms are used to educate data through the software R, and the results are summarized in the tables.\n\n\nFindings\nThe data points occurring in all algorithms are finally detected as an anomaly. The anomaly identifies the data points that deviate from the data set’s normal behavior. The subject which is consistently identified as anomalous by the different techniques is marked as an anomaly in evaluation. After identification, one can drill down to more details into the title of anomalies in the evaluation criteria.\n\n\nOriginality/value\nThis paper proposes an analytical model for the course evaluation process and demonstrates the use of actionable analytics to detect anomalies in the evaluation process.\n","PeriodicalId":44954,"journal":{"name":"Interactive Technology and Smart Education","volume":" ","pages":""},"PeriodicalIF":3.5000,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Interactive Technology and Smart Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/itse-09-2022-0124","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 1
Abstract
Purpose
Course evaluations are formative and are used to assess students' learning in a course. Anomalies in the evaluation process can lead to faulty educational outcomes. Learning analytics and educational data mining provide a set of techniques that can be conveniently applied to the extensive data collected as part of the evaluation process so that remedial actions can be taken. This study aims to conduct experimental research to detect anomalies in the evaluation methods.
Design/methodology/approach
Experimental research is conducted with a scientific approach and design. The researchers categorized anomalies into three categories, namely, anomalies in criteria assessment, subject anomalies and anomalies in subject marks allocation. Different anomaly detection algorithms are applied to the data using the R software, and the results are summarized in tables.
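To make the approach concrete, the following is a minimal R sketch (not the authors' code) that applies two simple detectors available in base R, an interquartile-range rule and a Mahalanobis-distance check, to subject-level evaluation marks. All data values, column names and thresholds are assumptions introduced here for illustration only.

```r
# Minimal sketch (not the authors' code): apply two simple base-R detectors
# to hypothetical subject-level evaluation marks. All values, column names
# and thresholds below are assumptions made for illustration only.

# Hypothetical evaluation data: average criterion marks per subject
marks <- data.frame(
  subject  = paste0("SUB", 1:10),
  internal = c(14, 15, 13, 16, 15, 5, 14, 15, 16, 14),
  external = c(55, 60, 58, 62, 59, 20, 57, 61, 63, 90)
)
X <- marks[, c("internal", "external")]

# Detector 1: univariate interquartile-range (IQR) rule per criterion
iqr_flags <- apply(X, 2, function(v) {
  q <- quantile(v, c(0.25, 0.75))
  v < q[1] - 1.5 * IQR(v) | v > q[2] + 1.5 * IQR(v)
})
flag_iqr <- which(rowSums(iqr_flags) > 0)

# Detector 2: multivariate Mahalanobis distance with a chi-square cutoff
d2 <- mahalanobis(X, colMeans(X), cov(X))
flag_maha <- which(d2 > qchisq(0.975, df = ncol(X)))

flag_iqr   # rows flagged by the IQR rule
flag_maha  # rows flagged by the Mahalanobis check
```

In practice, each detector produces its own set of flagged subjects, which can then be compared across techniques.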
Findings
Data points flagged by all of the algorithms are finally detected as anomalies. Anomaly detection identifies the data points that deviate from the data set's normal behavior. A subject that is consistently identified as anomalous by the different techniques is marked as an anomaly in evaluation. After identification, one can drill down into the details of the anomalies in the evaluation criteria.
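A minimal sketch of this consensus step in R, using hypothetical row indices produced by hypothetical detectors, could look as follows:

```r
# Minimal sketch: keep only the data points flagged by every detector.
# The row indices below are hypothetical outputs of three different
# anomaly detection techniques run on the same evaluation data.
flags <- list(
  iqr         = c(6, 10),
  mahalanobis = c(6, 10),
  lof         = c(6)
)
consensus <- Reduce(intersect, flags)
consensus  # row 6 is consistently identified as anomalous
```

Only the rows in the intersection of all flag sets are reported as anomalies in the evaluation.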
Originality/value
This paper proposes an analytical model for the course evaluation process and demonstrates the use of actionable analytics to detect anomalies in the evaluation process.
About the journal:
Interactive Technology and Smart Education (ITSE) is a multi-disciplinary, peer-reviewed journal, which provides a distinct forum to promote innovation and participative research approaches. The following terms are defined as used in the context of this journal:
- Interactive Technology refers to all forms of digital technology, as described above, emphasizing innovation and human-/user-centred approaches.
- Smart Education: "SMART" is used as an acronym that refers to interactive technology that offers a more flexible and tailored approach to meet diverse individual requirements by being “Sensitive, Manageable, Adaptable, Responsive and Timely” to educators’ pedagogical strategies and learners’ educational and social needs.
- Articles are invited that explore innovative use of educational technologies that advance interactive technology in general and its applications in education in particular.
The journal aims to bridge gaps in the field by promoting design research, action research, and continuous evaluation as an integral part of the development cycle of usable solutions/systems.