Code Completion for Programming Education based on Recurrent Neural Network

Kenta Terada, Y. Watanobe
{"title":"Code Completion for Programming Education based on Recurrent Neural Network","authors":"Kenta Terada, Y. Watanobe","doi":"10.1109/IWCIA47330.2019.8955090","DOIUrl":null,"url":null,"abstract":"In solving programming problems, it is difficult for beginners to create program code from scratch. One way to navigate this difficulty is to provide a function of automatic code completion. In this work, we propose a method to predict the next word following a given incomplete program that has two key constituents, prediction of within-vocabulary words and prediction of identifiers. In terms of predicting within-vocabulary words, a neural language model based on a Long Short-Term Memory (LSTM) network is proposed. Regarding the prediction of identifiers, a model based on a pointer network is proposed. Additionally, a model for switching between these two models is proposed. For evaluation of the proposed method, source code accumulated in an online judge system is used. The results of the experiment demonstrate that the proposed method can predict both the next within-vocabulary word and the next identifier to a high degree of accuracy.","PeriodicalId":139434,"journal":{"name":"2019 IEEE 11th International Workshop on Computational Intelligence and Applications (IWCIA)","volume":"459 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 11th International Workshop on Computational Intelligence and Applications (IWCIA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWCIA47330.2019.8955090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 13

Abstract

In solving programming problems, it is difficult for beginners to create program code from scratch. One way to address this difficulty is to provide an automatic code-completion function. In this work, we propose a method for predicting the next word following a given incomplete program. The method has two key constituents: prediction of within-vocabulary words and prediction of identifiers. For within-vocabulary words, a neural language model based on a Long Short-Term Memory (LSTM) network is proposed. For identifiers, a model based on a pointer network is proposed. Additionally, a model for switching between these two models is proposed. For evaluation, source code accumulated in an online judge system is used. The experimental results demonstrate that the proposed method can predict both the next within-vocabulary word and the next identifier with a high degree of accuracy.
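
The abstract outlines three components: an LSTM language model for within-vocabulary words, a pointer network for identifiers, and a switching model that chooses between them. The sketch below illustrates how such a combination might be wired together in PyTorch; the layer sizes, module names, and the sigmoid-gate switching are illustrative assumptions rather than the authors' exact architecture.

```python
# Minimal sketch, assuming a PyTorch-style implementation: an LSTM language
# model scores in-vocabulary words, an attention-based pointer scores tokens
# already seen in the input (so identifiers can be copied), and a learned
# switch mixes the two distributions. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CompletionModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.vocab_head = nn.Linear(hidden_dim, vocab_size)  # within-vocabulary words
        self.attn = nn.Linear(hidden_dim, hidden_dim)         # pointer attention
        self.switch = nn.Linear(hidden_dim, 1)                # vocab vs. pointer gate

    def forward(self, tokens):
        # tokens: (batch, seq_len) token ids of the incomplete program
        emb = self.embed(tokens)
        outputs, _ = self.lstm(emb)            # (batch, seq_len, hidden)
        last = outputs[:, -1, :]               # hidden state at the completion point

        # 1) LSTM language model: distribution over the fixed vocabulary.
        p_vocab = F.softmax(self.vocab_head(last), dim=-1)

        # 2) Pointer network: attention over earlier positions, allowing
        #    previously used identifiers to be copied even if out of vocabulary.
        scores = torch.bmm(outputs, self.attn(last).unsqueeze(-1)).squeeze(-1)
        p_pointer = F.softmax(scores, dim=-1)  # (batch, seq_len)

        # 3) Switching model: probability of drawing from the vocabulary
        #    distribution versus copying via the pointer.
        g = torch.sigmoid(self.switch(last))   # (batch, 1)
        return g * p_vocab, (1 - g) * p_pointer
```

At prediction time, the next token would be taken as the argmax over the two weighted distributions together, with pointer positions mapped back to the identifiers they reference in the input program.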