Voluntary use of automated writing evaluation by content course students

Aysel Saricaoglu, Zeynep Bilki
ReCALL, pp. 1–13. Published 2021-03-31.
DOI: 10.1017/S0958344021000021
Impact Factor: 4.6 · JCR Q1 (Education & Educational Research) · CAS Region 1 (Literature)
Citations: 12

Abstract

Automated writing evaluation (AWE) technologies are common supplementary tools for helping students improve their language accuracy using automated feedback. In most existing studies, AWE has been implemented as a class activity or an assignment requirement in English or academic writing classes. The potential of AWE as a voluntary language learning tool is unknown. This study reports on the voluntary use of Criterion by English as a foreign language students in two content courses for two assignments. We investigated (a) to what extent students used Criterion and (b) to what extent their revisions based on automated feedback increased the accuracy of their writing from the first submitted draft to the last in both assignments. We analyzed students’ performance summary reports from Criterion using descriptive statistics and nonparametric statistical tests. The findings showed that not all students used Criterion or resubmitted a revised draft. However, the findings also showed that engagement with automated feedback significantly reduced users’ errors from the first draft to the last in 11 error categories in total for the two assignments.
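The comparison described in the abstract (error counts in the first submitted draft versus the last, tested nonparametrically) can be illustrated with a simple sign test. The error counts below are invented for illustration, and the sign test is only one of several nonparametric tests the authors could have used; this is a sketch of the general approach, not the study's actual data or exact procedure.

```python
from math import comb

# Hypothetical per-student error counts in one error category
# (first draft vs. last draft) -- illustrative values only.
first_draft = [9, 7, 5, 8, 6, 10, 4]
last_draft = [5, 6, 2, 8, 3, 6, 4]

# Sign test: count students whose errors decreased vs. increased
# from first to last draft (ties are dropped, as is standard).
decreases = sum(1 for f, l in zip(first_draft, last_draft) if l < f)
increases = sum(1 for f, l in zip(first_draft, last_draft) if l > f)
n = decreases + increases

# Two-sided exact binomial p-value under H0: P(decrease) = 0.5.
k = max(decreases, increases)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2**n)

print(f"decreases={decreases}, increases={increases}, p={p_value:.3f}")
```

A small p-value here would indicate that revisions after automated feedback reduced errors more often than chance would predict, which mirrors the kind of first-draft-to-last-draft accuracy comparison the study reports.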
Source journal: ReCALL
CiteScore: 8.50 · Self-citation rate: 4.40% · Articles per year: 17