Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation

IF 4.0 · CAS Region 2 (Education) · JCR Q1 EDUCATION & EDUCATIONAL RESEARCH
Ryusuke Murata, Fumiya Okubo, T. Minematsu, Yuta Taniguchi, Atsushi Shimada
DOI: 10.1177/07356331221129765
Journal: Journal of Educational Computing Research, 61(1), 639–670
Published: 2022-10-26 (Journal Article)
Citations: 2

Abstract

Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation
This study helps improve the early prediction of student performance by RNN-FitNets, which applies knowledge distillation (KD) to the time series direction of the recurrent neural network (RNN) model. The RNN-FitNets replaces the teacher model in KD with “an RNN model with a long-term time-series in which the features during the entire course are inputted” and the student model in KD with “an RNN model with a short-term time-series in which only the features during the early stages are inputted.” As a result, the RNN model in the early stage was trained to output the same results as the more accurate RNN model in the later stages. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students in total. The results showed that RNN-FitNets can improve early prediction. Moreover, the SHAP value was employed to explain the contribution of the input features to the prediction results by RNN-FitNets. It was shown that RNN-FitNets can consider the future effects of the input features from the early stages of the course.
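The distillation scheme the abstract describes — a "teacher" RNN that sees the whole course and a "student" RNN that sees only the early weeks, trained to produce matching outputs — can be sketched as follows. This is a minimal NumPy illustration of the idea, not the authors' implementation; the model sizes, weights, and input data are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_forward(x_seq, Wx, Wh, h0):
    """Run a vanilla tanh RNN over a sequence and return the final hidden state."""
    h = h0
    for x in x_seq:
        h = np.tanh(Wx @ x + Wh @ h)
    return h

# Toy dimensions (illustrative; the paper's model sizes are not given in the abstract).
n_features, n_hidden, T_full, T_early = 4, 8, 15, 5

Wx = rng.normal(scale=0.1, size=(n_hidden, n_features))
Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
h0 = np.zeros(n_hidden)

# One student's activity features: the entire course vs. the early stages only.
x_full = rng.normal(size=(T_full, n_features))
x_early = x_full[:T_early]

h_teacher = rnn_forward(x_full, Wx, Wh, h0)   # "teacher": inputs span the whole course
h_student = rnn_forward(x_early, Wx, Wh, h0)  # "student": inputs span only the early weeks

# Distillation objective: push the early-stage representation toward the full-course one.
distill_loss = np.mean((h_student - h_teacher) ** 2)
print(f"distillation loss: {distill_loss:.4f}")
```

In training, this squared-error term would be minimized alongside the ordinary prediction loss, so the early-stage model learns to anticipate what the full-course model would conclude.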
Source Journal
Journal of Educational Computing Research
CiteScore: 11.90
Self-citation rate: 6.20%
Articles per year: 69
Journal description: The goal of this Journal is to provide an international scholarly publication forum for peer-reviewed interdisciplinary research into the applications, effects, and implications of computer-based education. The Journal features articles useful for practitioners and theorists alike. The terms "education" and "computing" are viewed broadly. "Education" refers to the use of computer-based technologies at all levels of the formal education system, business and industry, home-schooling, lifelong learning, and unintentional learning environments. "Computing" refers to all forms of computer applications and innovations - both hardware and software. For example, this could range from mobile and ubiquitous computing to immersive 3D simulations and games to computing-enhanced virtual learning environments.