Assessing Writing: Latest Publications

Thirty years of writing assessment: A bibliometric analysis of research trends and future directions
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-06-07 DOI: 10.1016/j.asw.2024.100862
Jihua Dong, Yanan Zhao, Louisa Buckingham
Abstract: This study employs a bibliometric analysis to identify research trends in the field of writing assessment over the last 30 years (1993–2022). Drawing on a dataset of 1,712 articles and 52,092 unique references, keyword co-occurrence analyses were used to identify prominent research topics, co-citation analyses to identify influential publications and journals, and a structural variation analysis to identify transformative research in recent years. The results reveal the growing popularity of the writing assessment field and the increasing diversity of its research topics. Research trends have become more closely associated with technology and with cognitive and metacognitive processes. The influential publications indicate a shift in research interest towards cross-disciplinary work, and the journals identified as key venues for writing assessment research also changed across the three decades. The latest transformative research points to possible future directions, including the integration of computational methods in writing assessment and investigations into the relationships between writing quality and various factors. This study contributes to our understanding of the development and future directions of writing assessment research, and has implications for researchers and practitioners.

(Assessing Writing, Volume 61, Article 100862)
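The keyword co-occurrence step described in the abstract above can be sketched in a few lines: given each article's keyword list, count how often every unordered pair of keywords appears in the same article, then inspect the most frequent pairs. This is a minimal illustration under invented sample data, not the authors' actual pipeline.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(keyword_lists):
    """Count how often each unordered pair of keywords co-occurs in one article."""
    pairs = Counter()
    for kws in keyword_lists:
        # sort so ("a", "b") and ("b", "a") collapse into one undirected pair
        pairs.update(combinations(sorted(set(kws)), 2))
    return pairs

# Invented sample keyword lists, one per article
articles = [
    ["writing assessment", "rater effects", "EFL"],
    ["writing assessment", "automated scoring", "EFL"],
    ["automated scoring", "writing assessment"],
]
print(cooccurrence(articles).most_common(3))
```

In a real bibliometric workflow, the resulting pair counts would feed a network-analysis tool to cluster keywords into research topics.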
Citations: 0
EvaluMate: Using AI to support students’ feedback provision in peer assessment for writing
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-05-31 DOI: 10.1016/j.asw.2024.100864
Kai Guo
Abstract: Peer feedback plays an important role in promoting learning in the writing classroom. However, providing high-quality feedback can be demanding for student reviewers. To address this challenge, this article proposes an AI-enhanced approach to peer feedback provision. I introduce EvaluMate, a newly developed online peer review system that leverages ChatGPT, a large language model (LLM), to scaffold student reviewers’ feedback generation. I discuss the design and functionality of EvaluMate, highlighting its affordances in supporting student reviewers’ provision of comments on peers’ essays. I also address the system’s limitations and propose potential solutions. Furthermore, I recommend future research on students’ engagement with this learning approach and its impact on learning outcomes. By presenting EvaluMate, I aim to inspire researchers and practitioners to explore the potential of AI technology in the teaching, learning, and assessment of writing.

(Assessing Writing, Volume 61, Article 100864)
Citations: 0
Comparing Chinese L2 writing performance in paper-based and computer-based modes: Perspectives from the writing product and process
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-05-31 DOI: 10.1016/j.asw.2024.100849
Xiaozhu Wang, Jimin Wang
Abstract: As writing is a complex language-producing process dependent on the writing environment and medium, the comparability of computer-based (CB) and paper-based (PB) writing assessments has been studied extensively since the emergence of computer-based writing assessment. This study investigated differences in the writing product and process between CB and PB modes of writing assessment in Chinese as a second language, whose character-based writing system is considered challenging for learners. The many-facet Rasch model (MFRM) was adopted to reveal differences in text quality, and keystroke and handwriting trace data were used to examine the writing process. The results showed that Chinese L2 learners produced higher-quality texts with fewer character mistakes in the CB mode. They also revised much more, and paused for shorter durations and less frequently between lower-level linguistic units, in the CB mode. The quality of CB texts was associated with revision behavior, whereas pause duration was a stronger predictor of PB text quality. The findings suggest that the act of handwriting Chinese characters makes the construct of PB writing assessment distinct from CB writing assessment in L2 Chinese. Thus, the choice of assessment mode should consider the target language use and test takers’ characteristics.

(Assessing Writing, Volume 61, Article 100849)
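The pause analysis described above rests on extracting inter-keystroke intervals from a timestamped log. Below is a minimal sketch of that computation, assuming a simple (timestamp, character) log format and a 2-second pause threshold; both the format and the threshold are assumptions for illustration, not the study's actual criteria.

```python
from statistics import mean

def pause_stats(events, threshold=2.0):
    """Summarize pauses in a keystroke log of (timestamp_seconds, char) tuples.

    A 'pause' is an inter-keystroke interval of at least `threshold` seconds;
    the 2-second default is an assumption, not the study's actual criterion.
    """
    gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
    pauses = [g for g in gaps if g >= threshold]
    return {
        "n_pauses": len(pauses),
        "mean_pause": mean(pauses) if pauses else 0.0,
        "pause_time_ratio": sum(pauses) / (events[-1][0] - events[0][0]),
    }

# Invented five-keystroke log with two long pauses (0.4→3.0 and 3.3→8.3)
log = [(0.0, "我"), (0.4, "们"), (3.0, "写"), (3.3, "作"), (8.3, "。")]
print(pause_stats(log))
```

A process study would additionally segment pauses by linguistic boundary (within character, between characters, between clauses), which requires aligning keystrokes with the produced text.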
Citations: 0
A teacher’s inquiry into diagnostic assessment in an EAP writing course
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-05-30 DOI: 10.1016/j.asw.2024.100848
Rabail Qayyum
Abstract: Research into diagnostic assessment of writing has largely ignored how diagnostic feedback information leads to differentiated instruction and learning. This case study presents a teacher’s account of validating an in-house diagnostic assessment procedure in an English for Academic Purposes writing course, with a view to refining it. I developed a validity argument and gathered and interpreted related evidence, focusing on one student’s performance in, and perception of, the assessment. The analysis revealed that, to an extent, the absence of proper feedback mechanisms limited the use of the test, somewhat weakened its impact, and reduced its potential for learning. I propose a modification to the assessment procedure involving a sample student feedback report.

(Assessing Writing, Volume 61, Article 100848)
Citations: 0
Construct representation and predictive validity of integrated writing tasks: A study on the writing component of the Duolingo English Test
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-05-28 DOI: 10.1016/j.asw.2024.100846
Qin Xie
Abstract: This study examined whether two integrated reading-to-write tasks could broaden the construct representation of the writing component of the Duolingo English Test (DET), and whether they could enhance DET’s power to predict English academic writing in universities. The tasks were (1) writing a summary based on two source texts and (2) writing a reading-to-write essay based on five texts. Both were given to a sample (N = 204) of undergraduates from Hong Kong. Each participant also submitted an academic assignment written for the assessment of a disciplinary course. Three professional raters double-marked all writing samples against detailed analytic rubrics. Raw scores were first processed using many-facet Rasch measurement to estimate inter- and intra-rater consistency and to generate adjusted (fair) measures. Based on these measures, descriptive analyses, sequential multiple regression, and structural equation modeling were conducted, in that order. The analyses verified the writing tasks’ underlying component constructs and assessed their relative contributions to the overall integrated writing scores. Both tasks were found to contribute to DET’s construct representation and to add moderate predictive power for domain performance. The findings and their practical implications are discussed, especially regarding the complex relations between construct representation and predictive validity.

(Assessing Writing, Volume 61, Article 100846; open access)
Citations: 0
How syntactic complexity indices predict Chinese L2 writing quality: An analysis of unified dependency syntactically-annotated corpus
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-05-16 DOI: 10.1016/j.asw.2024.100847
Yuxin Hao, Xuelin Wang, Shuai Bin, Qihao Yang, Haitao Liu
Abstract: Previous syntactic complexity (SC) research on L2 Chinese has overlooked a range of Chinese-specific structures and fine-grained indices. This study, utilizing a syntactically annotated Chinese L2 writing corpus, employs both large-grained and fine-grained syntactic complexity indices to investigate, from macro and micro perspectives, the relationship between syntactic complexity and the quality of writing produced by English-speaking Chinese second language (ECSL) learners. Our findings reveal the following: (a) at the large-grained level of analysis, the generic syntactic complexity (GSC) index number of T-units per sentence and the Chinese-specific syntactic complexity (CSC) index number of clauses per topic chain unit together account for 14.5% of the total variance in ECSL learners’ writing scores; (b) the syntactic diversity model alone accounts for 24.7% of the variance in Chinese writing scores; (c) a stepwise regression model integrating fine-grained SC indices extracted from the syntactically annotated corpus explains 43.7% of the variance in Chinese writing quality. This model incorporates CSC indices such as the average ratio of dependency types per 30 dependency segments, the ratio of adjuncts to sentence end, the ratio of predicate complements, the ratio of numeral adjuncts, and the mean length of Topic-Comment-Unit dependency distance, as well as GSC indices such as the ratio of main governors, the ratio of attributers, the ratio of coordinating adjuncts, and the ratio of sentential objects. These findings highlight the valuable insights that fine-grained, syntactically annotated SC indices offer into the writing characteristics of ECSL learners.

(Assessing Writing, Volume 61, Article 100847)
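The stepwise regression in finding (c) can be illustrated with a greedy forward-selection sketch: repeatedly add whichever index most improves R² of an ordinary least-squares fit. This is a toy reconstruction on synthetic data, not the authors' model; the index names and the data-generating process are invented.

```python
import numpy as np

def forward_stepwise(X, y, names, max_features=3):
    """Greedy forward selection: repeatedly add the predictor whose inclusion
    most improves the R^2 of an ordinary least-squares fit."""
    def r_squared(cols):
        A = np.column_stack([X[:, cols], np.ones(len(y))])  # design matrix + intercept
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    chosen, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining and len(chosen) < max_features:
        score, j = max((r_squared(chosen + [j]), j) for j in remaining)
        if score <= best + 1e-6:   # stop when the gain is negligible
            break
        best = score
        remaining.remove(j)
        chosen.append(j)
    return [names[j] for j in chosen], best

# Toy data: writing score driven mostly by clause length, a little by diversity
rng = np.random.default_rng(0)
clause_len = rng.normal(6, 1, 100)
diversity = rng.normal(0.5, 0.1, 100)
noise_idx = rng.normal(0, 1, 100)
score = 2 * clause_len + 5 * diversity + rng.normal(0, 0.5, 100)
X = np.column_stack([clause_len, diversity, noise_idx])
selected, r2 = forward_stepwise(X, score, ["clause_len", "diversity", "noise_idx"])
print(selected, round(r2, 3))
```

Production stepwise procedures typically use p-values or information criteria rather than raw R² (which can only grow as predictors are added); the sketch keeps R² for brevity.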
Citations: 0
Visualizing formative feedback in statistics writing: An exploratory study of student motivation using DocuScope Write & Audit
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-04-01 DOI: 10.1016/j.asw.2024.100830
Michael Laudenbach, David West Brown, Zhiyu Guo, Suguru Ishizaki, Alex Reinhart, Gordon Weinberg
Abstract: Formative feedback in writing instruction has recently been supported by technologies generally referred to as Automated Writing Evaluation tools. However, such tools are limited in their capacity to address specific disciplinary genres, and they have shown mixed results in improving student writing. We explore how technology-enhanced writing interventions can positively affect student attitudes toward, and beliefs about, writing, both reinforcing content knowledge and increasing student motivation. Using a student-facing text-visualization tool called Write & Audit, we hosted revision workshops for students (n = 30) in an introductory-level statistics course at a large North American university. The tool is designed to be flexible: instructors of various courses can create expectations and predefine genre-specific topics. In this way, students receive non-evaluative formative feedback that redirects them to field-specific strategies. To gauge the usefulness of Write & Audit, we used a previously validated survey instrument designed to measure a construct model of student motivation (Ling et al., 2021). Our results show significant increases in student self-efficacy and in beliefs about the importance of content to successful writing. We contextualize these findings with data from three student think-aloud interviews, which demonstrate metacognitive awareness while using the tool. Although this exploratory study is non-experimental, it contributes a novel approach to automated formative feedback and confirms the promising potential of Write & Audit.

(Assessing Writing, Volume 60, Article 100830; open access)
Citations: 0
Engagement with supervisory feedback on master’s theses: Do supervisors and students see eye to eye?
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-04-01 DOI: 10.1016/j.asw.2024.100841
Madhu Neupane Bastola, Guangwei Hu
Abstract: Student engagement has attracted much research attention in higher education because of the various potential benefits associated with improved engagement. Despite this extensive research, little has been written about graduate students’ engagement with supervisory feedback. This paper reports on a study of student engagement with supervisory feedback on master’s theses in the context of Nepalese higher education. The study employed an exploratory sequential mixed-methods design, drawing on interviews and a questionnaire-based survey involving supervisors and students from four disciplines at a comprehensive university in Nepal. Analyses of the qualitative and quantitative data revealed significant differences between supervisors’ and students’ perceptions of all types of student engagement (affective, cognitive, and behavioral). Significant disciplinary variations were also observed in supervisors’ and students’ perceptions of negative affect, cognitive engagement, and behavioral engagement. Furthermore, disciplinary background and feedback role interacted to shape perceptions of student engagement. These findings have implications for improving student engagement with supervisory feedback.

(Assessing Writing, Volume 60, Article 100841)
Citations: 0
Volume 60 editorial
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-04-01 DOI: 10.1016/j.asw.2024.100857
David Slomp, Martin East
(Assessing Writing, Volume 60, Article 100857; no abstract)
Citations: 0
Linguistic factors affecting L1 language evaluation in argumentative essays of students aged 16 to 18 attending secondary education in Greece
IF 3.9 · Q1 (Literature)
Assessing Writing Pub Date: 2024-04-01 DOI: 10.1016/j.asw.2024.100844
Koskinas Emmanouil, Gavriilidou Zoe, Andras Christos, Angelos Markos
Abstract: The purpose of this paper is to investigate linguistic factors affecting the evaluation of argumentative essays in written tests taken by junior and senior students, aged 16 to 18, attending high schools in Greece. To this end, we analyzed the textual characteristics and scores of essays by 265 juniors and seniors, graded by 15 different raters. To examine the contribution of linguistic parameters to the assessment, we developed an automated tool to record and evaluate students’ lexical and syntactic features in Greek. The results revealed that extensive use of nominal groups consisting of an adjective and a noun, the use of both impersonal and passive syntax, and, to a lesser extent, adverbs contribute the most to positive grading in language tests. Furthermore, we identified a correlation between language and the other criteria of the evaluation rubric, namely content and organization. The paper contributes to the discussion of objectivity in writing evaluation in the Greek setting and to the creation of a rubric that ensures more effective assessment of writing tasks.

(Assessing Writing, Volume 60, Article 100844)
Citations: 0