Knowledge tracking model based on recurrent neural network and transformer

Yan Cheng, Songhua Zhao, Jiansheng Hu, Haifeng Zou, Pin Luo, Yan Fu, Linhui Zhong, Chunlei Liu
{"title":"Knowledge tracking model based on recurrent neural network and transformer","authors":"Yan Cheng, Songhua Zhao, Jiansheng Hu, Haifeng Zou, Pin Luo, Yan Fu, Linhui Zhong, Chunlei Liu","doi":"10.1117/12.2680016","DOIUrl":null,"url":null,"abstract":"With the continuous development of online education platform, knowledge tracking (KT) has become a key technology to help online education platform provide personalized education. However, the existing knowledge tracking model based on recurrent neural network is difficult to be used for the input of long sequence, and has the problem of long-term dependence. Secondly, although the knowledge tracking model based on Transformer does not have the problem of long-term dependence, it is difficult to capture the input sequence information. Therefore, this paper proposes a knowledge tracking model based on recurrent neural network and transformer. A new position coding method is designed, and LSTM is used to replace the position coding method of Transformer to encode sequence features, so that the model in this paper can not only capture the input sequence information, but also get rid of the long-term dependency problem based on the recurrent neural network, and use GRU network to capture the context information. In addition, an adaptive fusion gate is designed to fuse the global features and context features obtained by Transformer, and use the fused features to predict the students' answers to the next question. 
In addition, an adaptive fusion gate is designed to fuse the global features and context features obtained by Transformer, and use the fused features to predict the students' answers to the next question.","PeriodicalId":201466,"journal":{"name":"Symposium on Advances in Electrical, Electronics and Computer Engineering","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Symposium on Advances in Electrical, Electronics and Computer Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2680016","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

With the continuous development of online education platforms, knowledge tracking (KT) has become a key technology for providing personalized education. However, existing KT models based on recurrent neural networks are difficult to apply to long input sequences and suffer from the long-term dependency problem. Conversely, although Transformer-based KT models avoid the long-term dependency problem, they struggle to capture the order information of the input sequence. Therefore, this paper proposes a knowledge tracking model that combines a recurrent neural network with a Transformer. A new position encoding method is designed: an LSTM replaces the Transformer's position encoding to encode sequence features, so that the proposed model can capture input sequence information while avoiding the long-term dependency problem of recurrent neural networks, and a GRU network is used to capture context information. In addition, an adaptive fusion gate is designed to fuse the global features obtained by the Transformer with the context features, and the fused features are used to predict the student's answer to the next question.
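The paper does not publish the formula for its adaptive fusion gate, but gated fusion of two feature vectors is commonly implemented as a learned sigmoid gate that interpolates between them. The sketch below illustrates that general pattern in NumPy; the function name, weight shapes, and the exact gating formula are assumptions for illustration, not the authors' published method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_fusion_gate(global_feat, context_feat, W, b):
    """Gated fusion of a global feature vector (e.g. from a Transformer)
    and a context feature vector (e.g. from a GRU).

    gate  = sigmoid([global; context] @ W + b)   # elementwise gate in (0, 1)
    fused = gate * global + (1 - gate) * context # convex combination per dim
    """
    z = np.concatenate([global_feat, context_feat], axis=-1)
    gate = sigmoid(z @ W + b)
    return gate * global_feat + (1.0 - gate) * context_feat

# Toy example with random features and small random weights.
rng = np.random.default_rng(0)
d = 8                                   # feature dimension (illustrative)
g = rng.standard_normal(d)              # "global" features
c = rng.standard_normal(d)              # "context" features
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)

fused = adaptive_fusion_gate(g, c, W, b)
print(fused.shape)
```

Because the gate lies in (0, 1) elementwise, each fused coordinate is a convex combination of the corresponding global and context coordinates, letting the model learn per-dimension how much to trust each feature source.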