Assessing source use: Summary vs. reading-to-write argumentative essay

Journal: Assessing Writing (Elsevier)
Qin Xie
DOI: 10.1016/j.asw.2023.100755
Published: 2023-07-01 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S1075293523000636
Citations: 0

Abstract

Assessing source use: Summary vs. reading-to-write argumentative essay

What is involved in source use and how to assess it have been key concerns of research on L2 integrated writing assessment. However, raters’ ability to reliably assess the construct remains scarcely investigated, as do the relations among different types of integrated writing tasks. To partially address this gap, the present study had a sizeable sample (N = 204) of undergraduates from three Hong Kong universities write a summary and an integrated reading-to-write argumentative essay task in a test-like condition. Then, focusing on the criteria of source use, it analysed raters’ application of analytical rubrics in assessing the writing outputs. Rater variability and scale structures were examined through the Multi-Facet Rasch Measurement and compared across the two writing tasks. Both similarities and differences were found. In the summary task, the criteria for source use were applied similarly to the criteria for language use and discourse features. In the essay task, however, the application of the source use criteria was much less consistent. Diagnostic statistics indicate that fewer levels on the scale would be more advisable. For both tasks, the criterion of source language use was found not to fit the overall model nor to align with the criteria for source ideas or language use, indicating that this criterion may represent a trait different from the other. The statistical relations between source use and the other subconstructs of integrated writing tasks are also reported herein. Implications are discussed in the interest of refining the assessment of the source use construct in the future.
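For readers unfamiliar with the method, the Many-Facet Rasch Measurement named above is commonly specified, in Linacre's rating-scale formulation, as follows (the paper's exact parameterisation may differ; the symbols below are the conventional ones, not taken from the article):

\[
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = \theta_n - \delta_i - \alpha_j - \tau_k
\]

where \(P_{nijk}\) is the probability that examinee \(n\) receives category \(k\) rather than \(k-1\) on criterion \(i\) from rater \(j\); \(\theta_n\) is the examinee's ability, \(\delta_i\) the difficulty of the rubric criterion, \(\alpha_j\) the severity of the rater, and \(\tau_k\) the threshold between adjacent scale categories. Under this model, the "rater variability" examined in the study corresponds to the spread and fit of the \(\alpha_j\) estimates, while the "scale structure" diagnostics concern the ordering of the \(\tau_k\) thresholds; disordered or narrowly spaced thresholds are the usual evidence behind a recommendation for fewer levels on the scale.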

Source journal
Assessing Writing