The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study

Impact Factor 2.6 · Tier 2 (Education) · Q1 EDUCATION & EDUCATIONAL RESEARCH
{"title":"项目格式在 PISA 2018 数学素养评估中的作用:跨国研究","authors":"","doi":"10.1016/j.stueduc.2024.101401","DOIUrl":null,"url":null,"abstract":"<div><p>When construct-irrelevant sources affect item difficulty, validity of the assessment is compromised. Using responses of 260000 students from 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed effects models, we examined three validity concerns associated with the construct-irrelevant factor, item format: whether the format influenced item difficulty, whether item format’s impact on difficulty varied across countries, undermining PISA’s foundational goal of meaningful country comparisons, and whether item format effects differed between genders, affecting assessment fairness. Item format contributed to a substantial average of 12 % of variance in item difficulties. The effect of item format was non-uniform across countries, with 30 % of the variance in item difficulties being due to format in lower-performing countries, and 10 % in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item format differences was minor. Implications for secondary research and assessment design are discussed.</p></div>","PeriodicalId":47539,"journal":{"name":"Studies in Educational Evaluation","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0191491X24000804/pdfft?md5=508c28bd0233e2005407a3aeaf5ccee2&pid=1-s2.0-S0191491X24000804-main.pdf","citationCount":"0","resultStr":"{\"title\":\"The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study\",\"authors\":\"\",\"doi\":\"10.1016/j.stueduc.2024.101401\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>When construct-irrelevant sources affect item difficulty, validity of the assessment is compromised. Using responses of 260000 students from 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed effects models, we examined three validity concerns associated with the construct-irrelevant factor, item format: whether the format influenced item difficulty, whether item format’s impact on difficulty varied across countries, undermining PISA’s foundational goal of meaningful country comparisons, and whether item format effects differed between genders, affecting assessment fairness. Item format contributed to a substantial average of 12 % of variance in item difficulties. The effect of item format was non-uniform across countries, with 30 % of the variance in item difficulties being due to format in lower-performing countries, and 10 % in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item format differences was minor. 
Implications for secondary research and assessment design are discussed.</p></div>\",\"PeriodicalId\":47539,\"journal\":{\"name\":\"Studies in Educational Evaluation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-09-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S0191491X24000804/pdfft?md5=508c28bd0233e2005407a3aeaf5ccee2&pid=1-s2.0-S0191491X24000804-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Studies in Educational Evaluation\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0191491X24000804\",\"RegionNum\":2,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Studies in Educational Evaluation","FirstCategoryId":"95","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0191491X24000804","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract


When construct-irrelevant sources affect item difficulty, the validity of the assessment is compromised. Using responses of 260,000 students from 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed-effects models, we examined three validity concerns associated with one construct-irrelevant factor, item format: whether format influenced item difficulty; whether its impact on difficulty varied across countries, undermining PISA’s foundational goal of meaningful country comparisons; and whether format effects differed between genders, affecting assessment fairness. Item format accounted for a substantial share of the variance in item difficulties, 12% on average. The effect was non-uniform across countries: format explained roughly 30% of the variance in item difficulties in lower-performing countries but only 10% in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item-format differences was minor. Implications for secondary research and assessment design are discussed.
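
The headline quantity in the abstract is the share of variance in item difficulties attributable to item format. The sketch below is not the authors' cross-classified mixed-effects model; it is a minimal Python illustration, on synthetic data with hypothetical column names, of how a "variance explained by format" figure can be computed from per-item difficulty estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data (hypothetical, not PISA 2018): one row per item,
# with an estimated difficulty (e.g., a logit difficulty from an IRT
# calibration) and the item's response format.
rng = np.random.default_rng(2018)
n_items = 80
fmt = rng.choice(["multiple_choice", "constructed_response"], size=n_items)
difficulty = rng.normal(0.0, 1.0, size=n_items) + np.where(
    fmt == "constructed_response", 0.4, 0.0
)
items = pd.DataFrame({"difficulty": difficulty, "format": fmt})

# Regress item difficulty on format; R^2 gives the proportion of
# item-difficulty variance associated with format. This is a simple one-way
# decomposition, unlike the paper's cross-classified model, which accounts
# for students, items, and countries simultaneously.
fit = smf.ols("difficulty ~ C(format)", data=items).fit()
print(f"Variance in item difficulty explained by format: {fit.rsquared:.1%}")
```

Repeating such a decomposition separately by country would mimic the contrast the abstract reports between lower- and higher-performing systems (roughly 30% versus 10% of difficulty variance due to format).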

Source journal: Studies in Educational Evaluation
CiteScore: 6.90
Self-citation rate: 6.50%
Articles published: 90
Review time: 62 days
About the journal: Studies in Educational Evaluation publishes original reports of evaluation studies. Four types of articles are published by the journal: (a) Empirical evaluation studies representing evaluation practice in educational systems around the world; (b) Theoretical reflections and empirical studies related to issues involved in the evaluation of educational programs, educational institutions, educational personnel and student assessment; (c) Articles summarizing the state-of-the-art concerning specific topics in evaluation in general or in a particular country or group of countries; (d) Book reviews and brief abstracts of evaluation studies.