{"title":"Examining the direct and indirect impacts of verbatim source use on linguistic complexity in integrated argumentative writing assessment","authors":"Huiying Cai , Xun Yan","doi":"10.1016/j.asw.2024.100868","DOIUrl":null,"url":null,"abstract":"<div><p>Verbatim source use (VSU) in integrated argumentative writing tasks may enhance linguistic complexity of writing performance. This assistance might present an unequal advantage for test-takers across levels of writing proficiency, engendering validity and fairness concerns. While previous research has mostly examined the relationships between source use characteristics and proficiency levels, the relationship between VSU and linguistic complexity remains underexplored. To further unpack these relationships, this study examined both the direct impact of VSU on linguistic complexity of writing performances and its indirect impact through interaction with writing proficiency. Using natural language processing tools and techniques, we examined 34 linguistic complexity features and three VSU features of 3250 argumentative writing performances on a university-level English Placement Test (EPT). We performed exploratory factor analysis to identify linguistic complexity dimensions and applied mixed-effect models to examine how VSU features and proficiency level impacted these dimensions. Post-hoc analyses suggested weak direct impacts of different VSU features on linguistic complexity, which might reflect different essay writing strategies. However, no meaningful indirect impact was found. The findings help unravel the impact of VSU on argumentative writing and provide empirical evidence for validity arguments for integrated writing assessments.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"61 ","pages":"Article 100868"},"PeriodicalIF":4.2000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1075293524000618/pdfft?md5=faa5261e073280115613145d7ef0bb9e&pid=1-s2.0-S1075293524000618-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessing Writing","FirstCategoryId":"98","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1075293524000618","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Abstract
Verbatim source use (VSU) in integrated argumentative writing tasks may enhance the linguistic complexity of writing performance. This assistance might present an unequal advantage for test-takers across levels of writing proficiency, engendering validity and fairness concerns. While previous research has mostly examined the relationships between source use characteristics and proficiency levels, the relationship between VSU and linguistic complexity remains underexplored. To further unpack these relationships, this study examined both the direct impact of VSU on the linguistic complexity of writing performances and its indirect impact through interaction with writing proficiency. Using natural language processing tools and techniques, we examined 34 linguistic complexity features and three VSU features of 3250 argumentative writing performances on a university-level English Placement Test (EPT). We performed exploratory factor analysis to identify linguistic complexity dimensions and applied mixed-effects models to examine how VSU features and proficiency level impacted these dimensions. Post-hoc analyses suggested weak direct impacts of different VSU features on linguistic complexity, which might reflect different essay writing strategies. However, no meaningful indirect impact was found. The findings help unravel the impact of VSU on argumentative writing and provide empirical evidence for validity arguments for integrated writing assessments.
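The abstract describes a two-step analysis: exploratory factor analysis to reduce the 34 complexity features to a smaller set of dimensions, then mixed-effects models relating VSU features and proficiency (including their interaction) to those dimensions. Below is a minimal Python sketch of that kind of pipeline, assuming a hypothetical per-essay feature table; the file name, column names (e.g., "vsu_ratio", "proficiency", "prompt"), and number of factors are illustrative assumptions, not the study's actual variables or implementation.

```python
# Hypothetical sketch: EFA over complexity features, then a mixed-effects model
# testing a VSU feature's direct effect and its interaction with proficiency.
import pandas as pd
from factor_analyzer import FactorAnalyzer
import statsmodels.formula.api as smf

df = pd.read_csv("ept_essays_features.csv")  # hypothetical per-essay feature table

# Assume complexity features are prefixed "cx_" (34 columns in the study)
complexity_cols = [c for c in df.columns if c.startswith("cx_")]

# Step 1: exploratory factor analysis to identify complexity dimensions
fa = FactorAnalyzer(n_factors=5, rotation="promax")  # factor count is illustrative
fa.fit(df[complexity_cols])
scores = fa.transform(df[complexity_cols])
for i in range(scores.shape[1]):
    df[f"dim_{i + 1}"] = scores[:, i]

# Step 2: mixed-effects model per dimension, with writing prompt as a random
# intercept; the VSU-by-proficiency term probes the indirect (interaction) impact
model = smf.mixedlm("dim_1 ~ vsu_ratio * proficiency",
                    data=df, groups=df["prompt"]).fit()
print(model.summary())
```

In a setup like this, a significant main effect of the VSU term would correspond to a direct impact on that complexity dimension, while a significant interaction term would indicate the proficiency-dependent (indirect) impact the study tested for.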
About the journal
Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal covers all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.