Authors: Weiwei Yang, Sara T. Cushing, Guoxing Yu
DOI: 10.1016/j.asw.2025.100985
Journal: Assessing Writing, Volume 66, Article 100985 (published 2025-10-01)
URL: https://www.sciencedirect.com/science/article/pii/S1075293525000728
Linguistic predictors of L2 writing performance: Variations across genres
This study investigated how linguistic complexity (including lexical and syntactic complexity), accuracy, and fluency (CAF) predicted second language (L2) writing scores across four essay genres: narration, exposition, expo-argumentation, and argumentation. Approximately 60 essays were collected for each genre on the same subject matter and were scored using a holistic rubric. Eight measures of complexity, accuracy, and fluency were examined. Forward stepwise regression analysis based on the corrected Akaike Information Criterion (AICc) was conducted for each genre. The findings revealed a large amount of score variance explained by CAF: 61% for the argumentative task and about 70% for the other three tasks. Fluency was found to be a highly important score predictor for the narrative and expository tasks, while lexical sophistication was as important as or more important than fluency for the expo-argumentative and argumentative tasks. The regression model for the narrative task also differed from those for the expository and argumentative task types with regard to syntactic complexity predictors. Lexical diversity was generally less important in predicting scores than lexical sophistication. The implications of the findings for L2 writing scoring and automated essay scoring are discussed.
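The selection procedure named in the abstract — forward stepwise regression driven by the corrected AIC (AICc) — can be sketched as follows. This is a minimal illustration of the general technique, not the study's actual analysis: the predictor names, data, and implementation details are hypothetical, and the study's eight CAF measures and scoring data are not reproduced here.

```python
import numpy as np

def aicc(y, y_hat, k):
    """Corrected AIC for an OLS model with k estimated parameters."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    aic = n * np.log(rss / n) + 2 * k
    return aic + (2 * k * (k + 1)) / (n - k - 1)

def fit_ols(X, y):
    """Least-squares fit with an intercept; returns fitted values."""
    Xb = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return Xb @ beta

def forward_stepwise(X, y, names):
    """Greedily add the predictor that most lowers AICc; stop when none helps."""
    selected, remaining = [], list(range(X.shape[1]))
    # Baseline: intercept-only model (k = 2 for intercept + error variance).
    best = aicc(y, np.full_like(y, y.mean()), k=2)
    while remaining:
        scores = []
        for j in remaining:
            cols = selected + [j]
            y_hat = fit_ols(X[:, cols], y)
            scores.append((aicc(y, y_hat, k=len(cols) + 2), j))
        score, j = min(scores)
        if score >= best:  # no candidate improves the criterion
            break
        best, selected = score, selected + [j]
        remaining.remove(j)
    return [names[j] for j in selected]

# Illustrative use on simulated data (names are hypothetical CAF measures):
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=60)
chosen = forward_stepwise(X, y, ["fluency", "lex_sophistication", "lex_diversity"])
```

Unlike p-value-based entry criteria, AICc balances fit against model size with a small-sample correction, which matters here given only about 60 essays per genre.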
Journal description:
Assessing Writing is a refereed international journal providing a forum for ideas, research, and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised) testing of writing, alternative performance assessments (such as portfolios), workplace sampling, and classroom assessment. The journal covers all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.