Sterett H. Mercer, Joanna Cannon, Bonita Squires, Yue Guo, Ella Pinco
{"title":"基于测量评分的自动书面表达课程的准确性","authors":"Sterett H. Mercer, Joanna Cannon, Bonita Squires, Yue Guo, Ella Pinco","doi":"10.1177/0829573520987753","DOIUrl":null,"url":null,"abstract":"We examined the extent to which automated written expression curriculum-based measurement (aWE-CBM) can be accurately used to computer score student writing samples for screening and progress monitoring. Students (n = 174) with learning difficulties in Grades 1 to 12 who received 1:1 academic tutoring through a community-based organization completed narrative writing samples in the fall and spring across two academic years. The samples were evaluated using four automated and hand-calculated WE-CBM scoring metrics. Results indicated automated and hand-calculated scores were highly correlated at all four timepoints for counts of total words written (rs = 1.00), words spelled correctly (rs = .99–1.00), correct word sequences (CWS; rs = .96–.97), and correct minus incorrect word sequences (CIWS; rs = .86–.92). For CWS and CIWS, however, automated scores systematically overestimated hand-calculated scores, with an unacceptable amount of error for CIWS for some types of decisions. These findings provide preliminary evidence that aWE-CBM can be used to efficiently score narrative writing samples, potentially improving the feasibility of implementing multi-tiered systems of support in which the written expression skills of large numbers of students are screened and monitored.","PeriodicalId":46445,"journal":{"name":"Canadian Journal of School Psychology","volume":"36 1","pages":"304 - 317"},"PeriodicalIF":3.3000,"publicationDate":"2020-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/0829573520987753","citationCount":"1","resultStr":"{\"title\":\"Accuracy of Automated Written Expression Curriculum-Based Measurement Scoring\",\"authors\":\"Sterett H. 
Mercer, Joanna Cannon, Bonita Squires, Yue Guo, Ella Pinco\",\"doi\":\"10.1177/0829573520987753\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We examined the extent to which automated written expression curriculum-based measurement (aWE-CBM) can be accurately used to computer score student writing samples for screening and progress monitoring. Students (n = 174) with learning difficulties in Grades 1 to 12 who received 1:1 academic tutoring through a community-based organization completed narrative writing samples in the fall and spring across two academic years. The samples were evaluated using four automated and hand-calculated WE-CBM scoring metrics. Results indicated automated and hand-calculated scores were highly correlated at all four timepoints for counts of total words written (rs = 1.00), words spelled correctly (rs = .99–1.00), correct word sequences (CWS; rs = .96–.97), and correct minus incorrect word sequences (CIWS; rs = .86–.92). For CWS and CIWS, however, automated scores systematically overestimated hand-calculated scores, with an unacceptable amount of error for CIWS for some types of decisions. 
These findings provide preliminary evidence that aWE-CBM can be used to efficiently score narrative writing samples, potentially improving the feasibility of implementing multi-tiered systems of support in which the written expression skills of large numbers of students are screened and monitored.\",\"PeriodicalId\":46445,\"journal\":{\"name\":\"Canadian Journal of School Psychology\",\"volume\":\"36 1\",\"pages\":\"304 - 317\"},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2020-12-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1177/0829573520987753\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Canadian Journal of School Psychology\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1177/0829573520987753\",\"RegionNum\":4,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EDUCATIONAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Canadian Journal of School Psychology","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1177/0829573520987753","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EDUCATIONAL","Score":null,"Total":0}
Accuracy of Automated Written Expression Curriculum-Based Measurement Scoring
We examined the extent to which automated written expression curriculum-based measurement (aWE-CBM) can be accurately used to computer-score student writing samples for screening and progress monitoring. Students (n = 174) with learning difficulties in Grades 1 to 12 who received 1:1 academic tutoring through a community-based organization completed narrative writing samples in the fall and spring across two academic years. The samples were evaluated using four automated and hand-calculated WE-CBM scoring metrics. Results indicated automated and hand-calculated scores were highly correlated at all four timepoints for counts of total words written (rs = 1.00), words spelled correctly (rs = .99–1.00), correct word sequences (CWS; rs = .96–.97), and correct minus incorrect word sequences (CIWS; rs = .86–.92). For CWS and CIWS, however, automated scores systematically overestimated hand-calculated scores, with an unacceptable amount of error for CIWS for some types of decisions. These findings provide preliminary evidence that aWE-CBM can be used to efficiently score narrative writing samples, potentially improving the feasibility of implementing multi-tiered systems of support in which the written expression skills of large numbers of students are screened and monitored.
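For readers unfamiliar with the four WE-CBM metrics named above, the sketch below illustrates, in simplified form, how each count is derived from a writing sample. It is not the scoring engine used in the study: real CWS/CIWS scoring also considers capitalization, punctuation, syntax, and sentence-boundary sequences, and the tiny word set here is a hypothetical stand-in for a full spelling lexicon.

```python
# Illustrative sketch of the four WE-CBM metrics (simplified; hypothetical lexicon).
KNOWN_WORDS = {"the", "dog", "ran", "fast", "to", "park"}  # stand-in for a real dictionary

def we_cbm_metrics(sample: str) -> dict:
    words = sample.lower().split()
    tww = len(words)                                    # total words written
    wsc = sum(w in KNOWN_WORDS for w in words)          # words spelled correctly
    # Correct word sequence (simplified): an adjacent pair in which both
    # words are spelled correctly. CIWS = correct minus incorrect sequences.
    pairs = list(zip(words, words[1:]))
    cws = sum(a in KNOWN_WORDS and b in KNOWN_WORDS for a, b in pairs)
    ciws = cws - (len(pairs) - cws)
    return {"TWW": tww, "WSC": wsc, "CWS": cws, "CIWS": ciws}

# One misspelling ("fsat") lowers WSC by 1 but costs two word sequences,
# which is why CIWS is the more error-sensitive (and harder to automate) metric.
print(we_cbm_metrics("the dog ran fsat to the park"))
```

The example shows why the study found larger automated-vs-hand-scored discrepancies for CIWS than for simple word counts: sequence-based metrics compound every local scoring disagreement.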
Journal Introduction:
The Canadian Journal of School Psychology (CJSP) is the official journal of the Canadian Association of School Psychologists and publishes papers focusing on the interface between psychology and education. Papers may reflect theory, research, and practice of psychology in education, as well as book and test reviews. The journal is aimed at practitioners but is also subscribed to by university libraries and individuals (i.e., psychologists). CJSP has become the major reference for practicing school psychologists and for students in graduate educational and school psychology programs in Canada.