Exploring the cross-lingual influence of linguistic complexity in second language writing assessment

IF 5.5 | CAS Region 1 (Literature) | JCR Q1, Education & Educational Research
Sara Geremia, Thomas Gaillat, Nicolas Ballier, Andrew J. Simpkin
{"title":"探讨语言复杂性对第二语言写作评价的跨语言影响","authors":"Sara Geremia ,&nbsp;Thomas Gaillat ,&nbsp;Nicolas Ballier ,&nbsp;Andrew J. Simpkin","doi":"10.1016/j.asw.2025.100951","DOIUrl":null,"url":null,"abstract":"<div><div>This paper explores the influence of L1 on the linguistic complexity of English learners. It relies on features extracted from texts and modelled using a statistical learning framework. Linguistic complexity is assessed automatically in terms of proficiency levels across different L1. We investigate whether proficiency grading by humans matches clusters of learner writings based on the similarity of linguistic features. We then use complexity metrics to automatically assess proficiency levels in samples of writings of different L1s. We focus on variable importance to understand which features best discriminate between levels. Analytic clusters of linguistic complexity data do not map well to learning levels, which promises poorly for the relevance of using language complexity metrics for level prediction. However, assessing L1 influence on linguistic complexity through a multinomial logistic regression with elastic net regularisation shows significant results. The models predict the proficiency levels of students of different L1s.</div></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"66 ","pages":"Article 100951"},"PeriodicalIF":5.5000,"publicationDate":"2025-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Exploring the cross-lingual influence of linguistic complexity in second language writing assessment\",\"authors\":\"Sara Geremia ,&nbsp;Thomas Gaillat ,&nbsp;Nicolas Ballier ,&nbsp;Andrew J. Simpkin\",\"doi\":\"10.1016/j.asw.2025.100951\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>This paper explores the influence of L1 on the linguistic complexity of English learners. It relies on features extracted from texts and modelled using a statistical learning framework. Linguistic complexity is assessed automatically in terms of proficiency levels across different L1. We investigate whether proficiency grading by humans matches clusters of learner writings based on the similarity of linguistic features. We then use complexity metrics to automatically assess proficiency levels in samples of writings of different L1s. We focus on variable importance to understand which features best discriminate between levels. Analytic clusters of linguistic complexity data do not map well to learning levels, which promises poorly for the relevance of using language complexity metrics for level prediction. However, assessing L1 influence on linguistic complexity through a multinomial logistic regression with elastic net regularisation shows significant results. 
The models predict the proficiency levels of students of different L1s.</div></div>\",\"PeriodicalId\":46865,\"journal\":{\"name\":\"Assessing Writing\",\"volume\":\"66 \",\"pages\":\"Article 100951\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2025-08-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Assessing Writing\",\"FirstCategoryId\":\"98\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1075293525000388\",\"RegionNum\":1,\"RegionCategory\":\"文学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessing Writing","FirstCategoryId":"98","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1075293525000388","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

This paper explores the influence of L1 on the linguistic complexity of English learners. It relies on features extracted from texts and modelled using a statistical learning framework. Linguistic complexity is assessed automatically in terms of proficiency levels across different L1s. We investigate whether proficiency grading by humans matches clusters of learner writings based on the similarity of linguistic features. We then use complexity metrics to automatically assess proficiency levels in samples of writings from different L1s. We focus on variable importance to understand which features best discriminate between levels. Analytic clusters of linguistic complexity data do not map well to learning levels, which bodes poorly for the relevance of using language complexity metrics for level prediction. However, assessing L1 influence on linguistic complexity through a multinomial logistic regression with elastic net regularisation shows significant results. The models predict the proficiency levels of students of different L1s.
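
The abstract itself contains no code. Purely as an illustration of the two steps it describes (clustering the complexity features and comparing the clusters to human-assigned grades, then fitting a multinomial logistic regression with elastic-net regularisation to predict proficiency level), here is a minimal Python sketch using scikit-learn. The file name complexity.csv, the column names level and L1, and all hyperparameters are assumptions for illustration, not details taken from the paper.

# A minimal sketch, assuming pandas/scikit-learn and a hypothetical file
# "complexity.csv" with one row per learner text, numeric linguistic-complexity
# metrics as columns, plus "level" (human-assigned proficiency) and "L1" columns.
# This is not the authors' implementation, only an illustration of the pipeline
# the abstract describes.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, adjusted_rand_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("complexity.csv")  # hypothetical input
X = StandardScaler().fit_transform(df.drop(columns=["level", "L1"]))
y = df["level"]

# 1. Do analytic clusters of the complexity features line up with human grades?
clusters = KMeans(n_clusters=y.nunique(), n_init=10, random_state=0).fit_predict(X)
print("cluster vs. level agreement (ARI):", adjusted_rand_score(y, clusters))

# 2. Multinomial logistic regression with elastic-net regularisation,
#    predicting proficiency level from the complexity metrics.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)
model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=1.0, max_iter=5000)
model.fit(X_tr, y_tr)
print("level-prediction accuracy:", accuracy_score(y_te, model.predict(X_te)))

# 3. Variable importance: mean absolute coefficient per (standardised) feature.
importance = np.abs(model.coef_).mean(axis=0)
feature_names = df.drop(columns=["level", "L1"]).columns
print(pd.Series(importance, index=feature_names).sort_values(ascending=False).head(10))

In practice the elastic-net mixing parameter (l1_ratio) and penalty strength (C) would be tuned by cross-validation, and the analysis would be repeated per L1 group to probe cross-lingual influence; the fixed values above are placeholders.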
Source journal: Assessing Writing
CiteScore: 6.00
Self-citation rate: 17.90%
Articles published per year: 67
Journal description: Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised forms of) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal focuses on all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.