AutoStyle: Toward Coding Style Feedback at Scale

J. Moghadam, R. R. Choudhury, Hezheng Yin, A. Fox
DOI: 10.1145/2724660.2728672
Published in: Proceedings of the Second (2015) ACM Conference on Learning @ Scale, 2015-03-14
Citations: 14

Abstract

While large-scale automatic grading of student programs for correctness is widespread, less effort has focused on automating feedback for good programming style: the tasteful use of language features and idioms to produce code that is not only correct, but also concise, elegant, and revealing of design intent. We hypothesize that with a large enough (MOOC-sized) corpus of submissions to a given programming problem, we can observe a range of stylistic mastery from naïve to expert, and many points in between, and that we can exploit this continuum to automatically provide hints to learners for improving their code style based on the key stylistic differences between a given learner's submission and a submission that is stylistically slightly better. We are developing a methodology for analyzing and doing feature engineering on differences between submissions, and for learning from instructor-provided feedback as to which hints are most relevant. We describe the techniques used to do this in our prototype, which will be deployed in a residential software engineering course as an alpha test prior to deploying in a MOOC later this year.
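The core idea of the abstract — rank a corpus of submissions along a stylistic continuum, find the submission slightly better than the learner's, and derive a hint from the difference — can be illustrated with a minimal sketch. The `style_score` heuristic below (penalizing length and nesting) is purely hypothetical; the paper's actual system relies on engineered features and instructor feedback, not this toy metric.

```python
import difflib

def style_score(code: str) -> float:
    # Hypothetical proxy for stylistic quality: shorter, less deeply
    # nested code scores higher. The real AutoStyle system uses richer
    # engineered features; this is only an illustrative stand-in.
    lines = [ln for ln in code.splitlines() if ln.strip()]
    length_penalty = len(lines)
    nesting_penalty = sum((len(ln) - len(ln.lstrip())) // 4 for ln in lines)
    return -(length_penalty + nesting_penalty)

def hint_from_continuum(submission: str, corpus: list[str]) -> list[str]:
    # Rank corpus submissions on the stylistic continuum, pick the one
    # that is only *slightly* better than the learner's (the nearest
    # better neighbor), and surface the textual differences as a hint.
    s = style_score(submission)
    better = sorted((c for c in corpus if style_score(c) > s), key=style_score)
    if not better:
        return []  # already at the top of the observed continuum
    target = better[0]
    diff = difflib.unified_diff(
        submission.splitlines(), target.splitlines(), lineterm="")
    return [d for d in diff
            if d.startswith(("+", "-")) and not d.startswith(("+++", "---"))]
```

A nearest slightly-better neighbor (rather than the expert solution) keeps the hint incremental: the learner sees one small stylistic step instead of a wholesale rewrite.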