Effects of a genre and topic knowledge activation device on a standardized writing test performance

IF 4.2 | CAS Tier 1 (Literature) | Q1 EDUCATION & EDUCATIONAL RESEARCH
Natalia Ávila Reyes , Diego Carrasco , Rosario Escribano , María Jesús Espinosa , Javiera Figueroa , Carolina Castillo
{"title":"Effects of a genre and topic knowledge activation device on a standardized writing test performance","authors":"Natalia Ávila Reyes ,&nbsp;Diego Carrasco ,&nbsp;Rosario Escribano ,&nbsp;María Jesús Espinosa ,&nbsp;Javiera Figueroa ,&nbsp;Carolina Castillo","doi":"10.1016/j.asw.2024.100898","DOIUrl":null,"url":null,"abstract":"<div><div>The aim of this article was twofold: first, to introduce a design for a writing test intended for application in large-scale assessments of writing, and second, to experimentally examine the effects of employing a device for activating prior knowledge of topic and genre as a means of controlling construct-irrelevant variance and enhancing validity. An authentic, situated writing task was devised, offering students a communicative purpose and a defined audience. Two devices were utilized for the cognitive activation of topic and genre knowledge: an infographic and a genre model. The participants in this study were 162 fifth-grade students from Santiago de Chile, with 78 students assigned to the experimental condition (with activation device) and 84 students assigned to the control condition (without activation device). The results demonstrate that the odds of presenting good writing ability are higher for students who were part of the experimental group, even when controlling for text transcription ability, considered a predictor of writing. These findings hold implications for the development of large-scale tests of writing guided by principles of educational and social justice.</div></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"62 ","pages":"Article 100898"},"PeriodicalIF":4.2000,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessing Writing","FirstCategoryId":"98","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1075293524000916","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

The aim of this article was twofold: first, to introduce a design for a writing test intended for application in large-scale assessments of writing, and second, to experimentally examine the effects of employing a device for activating prior knowledge of topic and genre as a means of controlling construct-irrelevant variance and enhancing validity. An authentic, situated writing task was devised, offering students a communicative purpose and a defined audience. Two devices were utilized for the cognitive activation of topic and genre knowledge: an infographic and a genre model. The participants in this study were 162 fifth-grade students from Santiago de Chile, with 78 students assigned to the experimental condition (with activation device) and 84 students assigned to the control condition (without activation device). The results demonstrate that the odds of presenting good writing ability are higher for students who were part of the experimental group, even when controlling for text transcription ability, considered a predictor of writing. These findings hold implications for the development of large-scale tests of writing guided by principles of educational and social justice.
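The abstract reports results in terms of odds while controlling for text transcription ability, which suggests a logistic-type analysis, although the exact model specification is not given here. The sketch below is a minimal, hypothetical illustration of such an analysis in Python using statsmodels; the variable names (condition, transcription, good_writing), the simulated data, and the coefficients are assumptions for illustration only, not the study's actual variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Group sizes as reported in the abstract: 78 experimental, 84 control (N = 162).
df = pd.DataFrame({
    "condition": np.repeat([1, 0], [78, 84]),      # 1 = activation device, 0 = no device
    "transcription": rng.normal(0.0, 1.0, 162),    # hypothetical standardized transcription score
})

# Simulated binary outcome ("good writing ability"); coefficients are arbitrary
# and used only to generate illustrative data.
logit = -0.2 + 0.8 * df["condition"] + 0.5 * df["transcription"]
df["good_writing"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Logistic regression of the outcome on condition, controlling for transcription.
model = smf.logit("good_writing ~ condition + transcription", data=df).fit()
print(model.summary())

# The exponentiated condition coefficient is the odds ratio for the experimental group.
print("Odds ratio (condition):", np.exp(model.params["condition"]))
```

Under a model of this kind, the claim that the odds of good writing ability are higher for the experimental group even when controlling for transcription ability corresponds to an exponentiated condition coefficient greater than 1.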
Source journal
Assessing Writing
CiteScore: 6.00
Self-citation rate: 17.90%
Articles published: 67
Journal description: Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised forms of) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal focuses on all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.