Transformed by Transformers: Navigating the AI Coding Revolution for Computing Education: An ITiCSE Working Group Conducted by Humans

J. Prather, Paul Denny, Juho Leinonen, Brett A. Becker, Ibrahim Albluwi, M. Caspersen, Michelle Craig, H. Keuning, Natalie Kiesler, Tobias Kohn, Andrew Luxton-Reilly, S. Macneil, Andrew Petersen, Raymond Pettit, B. Reeves, Jaromír Šavelka
{"title":"由变形金刚改造:为计算机教育导航人工智能编码革命:由人类进行的ITiCSE工作组","authors":"J. Prather, Paul Denny, Juho Leinonen, Brett A. Becker, Ibrahim Albluwi, M. Caspersen, Michelle Craig, H. Keuning, Natalie Kiesler, Tobias Kohn, Andrew Luxton-Reilly, S. Macneil, Andrew Petersen, Raymond Pettit, B. Reeves, Jaromír Šavelka","doi":"10.1145/3587103.3594206","DOIUrl":null,"url":null,"abstract":"The recent advent of highly accurate and scalable large language models (LLMs) has taken the world by storm. From art to essays to computer code, LLMs are producing novel content that until recently was thought only humans could produce. Recent work in computing education has sought to understand the capabilities of LLMs for solving tasks such as writing code, explaining code, creating novel coding assignments, interpreting programming error messages, and more. However, these technologies continue to evolve at an astonishing rate leaving educators little time to adapt. This working group seeks to document the state-of-the-art for code generation LLMs, detail current opportunities and challenges related to their use, and present actionable approaches to integrating them into computing curricula.","PeriodicalId":366365,"journal":{"name":"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Transformed by Transformers: Navigating the AI Coding Revolution for Computing Education: An ITiCSE Working Group Conducted by Humans\",\"authors\":\"J. Prather, Paul Denny, Juho Leinonen, Brett A. Becker, Ibrahim Albluwi, M. Caspersen, Michelle Craig, H. Keuning, Natalie Kiesler, Tobias Kohn, Andrew Luxton-Reilly, S. Macneil, Andrew Petersen, Raymond Pettit, B. Reeves, Jaromír Šavelka\",\"doi\":\"10.1145/3587103.3594206\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The recent advent of highly accurate and scalable large language models (LLMs) has taken the world by storm. From art to essays to computer code, LLMs are producing novel content that until recently was thought only humans could produce. Recent work in computing education has sought to understand the capabilities of LLMs for solving tasks such as writing code, explaining code, creating novel coding assignments, interpreting programming error messages, and more. However, these technologies continue to evolve at an astonishing rate leaving educators little time to adapt. This working group seeks to document the state-of-the-art for code generation LLMs, detail current opportunities and challenges related to their use, and present actionable approaches to integrating them into computing curricula.\",\"PeriodicalId\":366365,\"journal\":{\"name\":\"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 
2\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3587103.3594206\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3587103.3594206","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

The recent advent of highly accurate and scalable large language models (LLMs) has taken the world by storm. From art to essays to computer code, LLMs are producing novel content that until recently was thought only humans could produce. Recent work in computing education has sought to understand the capabilities of LLMs for solving tasks such as writing code, explaining code, creating novel coding assignments, interpreting programming error messages, and more. However, these technologies continue to evolve at an astonishing rate leaving educators little time to adapt. This working group seeks to document the state-of-the-art for code generation LLMs, detail current opportunities and challenges related to their use, and present actionable approaches to integrating them into computing curricula.
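As a minimal illustration of one task the abstract mentions, interpreting programming error messages, the sketch below shows how an LLM might be prompted to explain a Python error to a novice. This is not part of the working group's report; it assumes the `openai` Python package (v1+), an `OPENAI_API_KEY` environment variable, and an illustrative model name.

```python
# Hedged sketch: asking an LLM to interpret a programming error message
# for a student. Model name and tutoring prompt are illustrative choices,
# not anything prescribed by the working group.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

error_message = (
    "Traceback (most recent call last):\n"
    '  File "main.py", line 3, in <module>\n'
    "    print(total)\n"
    "NameError: name 'total' is not defined"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {
            "role": "system",
            "content": (
                "You are a patient CS1 tutor. Explain the error in plain "
                "language and suggest a fix without writing the full solution."
            ),
        },
        {"role": "user", "content": f"Please explain this error:\n{error_message}"},
    ],
)

print(response.choices[0].message.content)
```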