{"title":"在ESL语境中使用语法检查器:对自动纠正反馈的调查","authors":"P. John, Nina Woll","doi":"10.1558/cj.36523","DOIUrl":null,"url":null,"abstract":"Our study examines written corrective feedback generated by two online grammar checkers (GCs), Grammarly and Virtual Writing Tutor, and by the grammar checking function of Microsoft Word. We tested the technology on a wide range of grammatical error types from two sources: a set of authentic ESL compositions and a series of simple sentences we generated ourselves. The GCs were evaluated in terms of (1) coverage (number of errors flagged), (2) appropriacy of proposed replacement forms, and (3) rates of “false alarms” (forms mistakenly flagged as incorrect). Although Grammarly and Virtual Writing Tutor outperformed Microsoft Word, neither of the online GCs had high rates of overall coverage (<50%). Consequently, they cannot be relied on to supply comprehensive feedback on student compositions. The finding of higher identification rates for errors from simple rather than authentic sentences reinforces this conclusion. Nonetheless, since few inaccurate replacement forms and false alarms were observed, only rarely is the feedback actively misleading. In addition, the GCs were better at handling some error types than others. Ultimately, we suggest that teachers use GCs with specially designed classroom activities that target selected error types before learners apply the technology to their own writing.","PeriodicalId":357125,"journal":{"name":"the CALICO Journal","volume":"1114 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-02-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":"{\"title\":\"Using Grammar Checkers in an ESL Context: An Investigation of Automatic Corrective Feedback\",\"authors\":\"P. John, Nina Woll\",\"doi\":\"10.1558/cj.36523\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Our study examines written corrective feedback generated by two online grammar checkers (GCs), Grammarly and Virtual Writing Tutor, and by the grammar checking function of Microsoft Word. We tested the technology on a wide range of grammatical error types from two sources: a set of authentic ESL compositions and a series of simple sentences we generated ourselves. The GCs were evaluated in terms of (1) coverage (number of errors flagged), (2) appropriacy of proposed replacement forms, and (3) rates of “false alarms” (forms mistakenly flagged as incorrect). Although Grammarly and Virtual Writing Tutor outperformed Microsoft Word, neither of the online GCs had high rates of overall coverage (<50%). Consequently, they cannot be relied on to supply comprehensive feedback on student compositions. The finding of higher identification rates for errors from simple rather than authentic sentences reinforces this conclusion. Nonetheless, since few inaccurate replacement forms and false alarms were observed, only rarely is the feedback actively misleading. In addition, the GCs were better at handling some error types than others. 
Ultimately, we suggest that teachers use GCs with specially designed classroom activities that target selected error types before learners apply the technology to their own writing.\",\"PeriodicalId\":357125,\"journal\":{\"name\":\"the CALICO Journal\",\"volume\":\"1114 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-02-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"16\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"the CALICO Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1558/cj.36523\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"the CALICO Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1558/cj.36523","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using Grammar Checkers in an ESL Context: An Investigation of Automatic Corrective Feedback
Our study examines written corrective feedback generated by two online grammar checkers (GCs), Grammarly and Virtual Writing Tutor, and by the grammar checking function of Microsoft Word. We tested the technology on a wide range of grammatical error types from two sources: a set of authentic ESL compositions and a series of simple sentences we generated ourselves. The GCs were evaluated in terms of (1) coverage (proportion of errors flagged), (2) appropriacy of proposed replacement forms, and (3) rates of “false alarms” (forms mistakenly flagged as incorrect). Although Grammarly and Virtual Writing Tutor outperformed Microsoft Word, neither of the online GCs achieved high overall coverage (<50%). Consequently, they cannot be relied on to supply comprehensive feedback on student compositions. The finding of higher identification rates for errors in the simple sentences than in the authentic compositions reinforces this conclusion. Nonetheless, since few inaccurate replacement forms and false alarms were observed, the feedback is only rarely actively misleading. In addition, the GCs handled some error types better than others. Ultimately, we suggest that teachers use GCs with specially designed classroom activities that target selected error types before learners apply the technology to their own writing.
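The three evaluation measures can be made concrete with a small scoring sketch. This is not the authors' published scoring procedure; the data structures, field names, and exact-match criterion below are assumptions chosen only to illustrate how coverage, replacement appropriacy, and false-alarm rate could be computed from manually annotated checker output.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical annotation records: the study does not release its scoring
# scripts, so these fields and the matching logic are illustrative only.

@dataclass
class AnnotatedError:
    span: str                  # erroneous form in the learner text
    correction: str            # reference correction supplied by the annotator
    flagged: bool              # did the grammar checker flag this error?
    suggestion: Optional[str]  # the checker's proposed replacement, if any

@dataclass
class Flag:
    span: str                  # text the checker flagged
    is_true_error: bool        # does the flag target a genuine error?

def coverage(errors: list[AnnotatedError]) -> float:
    """Share of annotated errors that the checker flagged at all."""
    return sum(e.flagged for e in errors) / len(errors)

def appropriacy(errors: list[AnnotatedError]) -> float:
    """Among flagged errors with a suggestion, share matching the reference correction."""
    flagged = [e for e in errors if e.flagged and e.suggestion is not None]
    if not flagged:
        return 0.0
    return sum(e.suggestion.strip() == e.correction.strip() for e in flagged) / len(flagged)

def false_alarm_rate(flags: list[Flag]) -> float:
    """Share of all flags raised against forms that were actually correct."""
    if not flags:
        return 0.0
    return sum(not f.is_true_error for f in flags) / len(flags)
```

Under this sketch, the paper's headline finding would correspond to `coverage(...)` staying below 0.5 for both online GCs, while `appropriacy(...)` and `false_alarm_rate(...)` indicate that the feedback that is given is seldom misleading.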