Toward Effective Courseware at Scale: Investigating Automatically Generated Questions as Formative Practice

Rachel Van Campenhout, Noam Brown, Bill Jerome, Jeffrey S. Dittel, Benny G. Johnson
DOI: 10.1145/3430895.3460162
Published in: Proceedings of the Eighth ACM Conference on Learning @ Scale, June 8, 2021
Cited by: 6

Abstract

Courseware is a comprehensive learning environment that engages students in a learning-by-doing approach while also giving instructors data-driven insights on their class, providing a scalable solution for many instructional models. However, courseware, and the volume of formative questions required to make it effective, is time-consuming and expensive to create. By using artificial intelligence for automatic question generation, we can reduce the time and cost of developing formative questions in courseware. However, it is critical that automatically generated (AG) questions have a level of quality on par with human-authored (HA) questions in order to be confident in their usage at scale. Therefore, our research question is: are student interactions with AG questions equivalent to those with HA questions with respect to engagement, difficulty, and persistence metrics? This paper evaluates data for AG and HA questions that students used as formative practice in their university Communication course. Analysis of AG and HA questions shows that our first generation of AG questions performs as well as HA questions in multiple important respects.
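The three metrics named in the research question can be operationalized in several ways. A minimal sketch, assuming engagement is the fraction of enrolled students who attempt at least one question, difficulty is the fraction of first attempts answered incorrectly, and persistence is the fraction of incorrect first attempts followed by a retry (the paper's exact definitions may differ, and the record schema and `question_metrics` helper here are hypothetical):

```python
from collections import defaultdict

def question_metrics(log, enrolled):
    """Compute per-source (AG vs. HA) practice metrics from attempt records.

    Each record is a dict with keys:
      "student", "question", "source" ("AG" or "HA"),
      "attempt" (1-based attempt number), "correct" (bool).
    Returned metrics (assumed definitions, not the paper's exact ones):
      engagement  - fraction of enrolled students attempting >= 1 question
      difficulty  - fraction of first attempts answered incorrectly
      persistence - fraction of incorrect first attempts followed by a retry
    """
    by_source = defaultdict(list)
    for rec in log:
        by_source[rec["source"]].append(rec)

    out = {}
    for source, recs in by_source.items():
        firsts = [r for r in recs if r["attempt"] == 1]
        wrong_firsts = [r for r in firsts if not r["correct"]]
        # An incorrect first attempt counts as "persisted" if the same
        # student later retried the same question.
        retried = [
            r for r in wrong_firsts
            if any(s["student"] == r["student"]
                   and s["question"] == r["question"]
                   and s["attempt"] > 1
                   for s in recs)
        ]
        out[source] = {
            "engagement": len({r["student"] for r in recs}) / enrolled,
            "difficulty": len(wrong_firsts) / len(firsts) if firsts else 0.0,
            "persistence": len(retried) / len(wrong_firsts) if wrong_firsts else 0.0,
        }
    return out
```

With metrics like these computed separately for AG and HA question pools, the equivalence question becomes a straightforward statistical comparison between the two groups.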