Ryusuke Murata, Fumiya Okubo, T. Minematsu, Yuta Taniguchi, Atsushi Shimada
{"title":"递归神经网络- fitnets:通过时间序列知识蒸馏改善学生成绩的早期预测","authors":"Ryusuke Murata, Fumiya Okubo, T. Minematsu, Yuta Taniguchi, Atsushi Shimada","doi":"10.1177/07356331221129765","DOIUrl":null,"url":null,"abstract":"This study helps improve the early prediction of student performance by RNN-FitNets, which applies knowledge distillation (KD) to the time series direction of the recurrent neural network (RNN) model. The RNN-FitNets replaces the teacher model in KD with “an RNN model with a long-term time-series in which the features during the entire course are inputted” and the student model in KD with “an RNN model with a short-term time-series in which only the features during the early stages are inputted.” As a result, the RNN model in the early stage was trained to output the same results as the more accurate RNN model in the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students in total. The results showed that RNN-FitNets can improve early prediction. Moreover, the SHAP value was employed to explain the contribution of the input features to the prediction results by RNN-FitNets. It was shown that RNN-FitNets can consider the future effects of the input features from the early stages of the course.","PeriodicalId":47865,"journal":{"name":"Journal of Educational Computing Research","volume":"61 1","pages":"639 - 670"},"PeriodicalIF":4.0000,"publicationDate":"2022-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performanceby Time-Series Knowledge Distillation\",\"authors\":\"Ryusuke Murata, Fumiya Okubo, T. 
Minematsu, Yuta Taniguchi, Atsushi Shimada\",\"doi\":\"10.1177/07356331221129765\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This study helps improve the early prediction of student performance by RNN-FitNets, which applies knowledge distillation (KD) to the time series direction of the recurrent neural network (RNN) model. The RNN-FitNets replaces the teacher model in KD with “an RNN model with a long-term time-series in which the features during the entire course are inputted” and the student model in KD with “an RNN model with a short-term time-series in which only the features during the early stages are inputted.” As a result, the RNN model in the early stage was trained to output the same results as the more accurate RNN model in the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students in total. The results showed that RNN-FitNets can improve early prediction. Moreover, the SHAP value was employed to explain the contribution of the input features to the prediction results by RNN-FitNets. 
It was shown that RNN-FitNets can consider the future effects of the input features from the early stages of the course.\",\"PeriodicalId\":47865,\"journal\":{\"name\":\"Journal of Educational Computing Research\",\"volume\":\"61 1\",\"pages\":\"639 - 670\"},\"PeriodicalIF\":4.0000,\"publicationDate\":\"2022-10-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational Computing Research\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1177/07356331221129765\",\"RegionNum\":2,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Computing Research","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1177/07356331221129765","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation
This study improves the early prediction of student performance with RNN-FitNets, which applies knowledge distillation (KD) along the time-series direction of a recurrent neural network (RNN) model. RNN-FitNets replaces the teacher model in KD with "an RNN model with a long-term time series, in which the features from the entire course are inputted," and the student model in KD with "an RNN model with a short-term time series, in which only the features from the early stages are inputted." As a result, the early-stage RNN model is trained to output the same results as the more accurate RNN model from the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students. The results showed that RNN-FitNets can improve early prediction. Moreover, SHAP values were employed to explain the contribution of the input features to the prediction results of RNN-FitNets. It was shown that RNN-FitNets can account for the future effects of the input features from the early stages of the course.
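The distillation scheme described above can be sketched in code: a teacher RNN reads the full-course feature sequence, a student RNN reads only the early weeks, and the student is trained against both the true labels and the teacher's outputs. This is a minimal sketch, not the authors' implementation; the GRU architecture, the MSE soft-target loss, the `alpha` weighting, and all dimensions (15 weeks total, 5 early weeks, 8 features) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GradePredictor(nn.Module):
    """A GRU that predicts performance from a sequence of weekly activity features."""
    def __init__(self, n_features=8, hidden=32, n_classes=2):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):               # x: (batch, weeks, n_features)
        _, h = self.rnn(x)              # h: final hidden state, (1, batch, hidden)
        return self.head(h.squeeze(0))  # logits: (batch, n_classes)

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Blend the usual task loss with a soft-target loss against the teacher.

    The teacher's logits are detached so gradients flow only into the student.
    """
    hard = nn.functional.cross_entropy(student_logits, labels)
    soft = nn.functional.mse_loss(student_logits, teacher_logits.detach())
    return alpha * hard + (1 - alpha) * soft

# Toy batch: 4 students, 15 weeks of features; the student model sees 5 weeks.
x = torch.randn(4, 15, 8)
y = torch.randint(0, 2, (4,))
teacher, student = GradePredictor(), GradePredictor()
with torch.no_grad():
    t_logits = teacher(x)           # teacher: long-term (full-course) time series
s_logits = student(x[:, :5, :])     # student: short-term (early-stage) time series
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()                     # trains the student to mimic the teacher
```

In a full training loop the teacher would first be trained on the complete sequences; the student then inherits its late-stage behavior while only ever seeing early-stage inputs, which is what enables the improved early prediction.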
About the Journal:
The goal of this Journal is to provide an international scholarly publication forum for peer-reviewed interdisciplinary research into the applications, effects, and implications of computer-based education. The Journal features articles useful for practitioners and theorists alike. The terms "education" and "computing" are viewed broadly. “Education” refers to the use of computer-based technologies at all levels of the formal education system, business and industry, home-schooling, lifelong learning, and unintentional learning environments. “Computing” refers to all forms of computer applications and innovations - both hardware and software. For example, this could range from mobile and ubiquitous computing to immersive 3D simulations and games to computing-enhanced virtual learning environments.