{"title":"在 PISA 评估的即时多阶段自适应测试中利用反应时间进行项目选择","authors":"Xiuxiu Tang, Yi Zheng, Tong Wu, K. Hau, H. Chang","doi":"10.1111/jedm.12403","DOIUrl":null,"url":null,"abstract":"Multistage adaptive testing (MST) has been recently adopted for international large‐scale assessments such as Programme for International Student Assessment (PISA). MST offers improved measurement efficiency over traditional nonadaptive tests and improved practical convenience over single‐item‐adaptive computerized adaptive testing (CAT). As a third alternative adaptive test design to MST and CAT, Zheng and Chang proposed the “on‐the‐fly multistage adaptive testing” (OMST), which combines the benefits of MST and CAT and offsets their limitations. In this study, we adopted the OMST design while also incorporating response time (RT) in item selection. Via simulations emulating the PISA 2018 reading test, including using the real item attributes and replicating PISA 2018 reading test's MST design, we compared the performance of our OMST designs against the simulated MST design in (1) measurement accuracy of test takers’ ability, (2) test time efficiency and consistency, and (3) expected gains in precision by design. We also investigated the performance of OMST in item bank usage and constraints management. Results show great potential for the proposed RT‐incorporated OMST designs to be used for PISA and potentially other international large‐scale assessments.","PeriodicalId":47871,"journal":{"name":"Journal of Educational Measurement","volume":null,"pages":null},"PeriodicalIF":1.4000,"publicationDate":"2024-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Utilizing Response Time for Item Selection in On‐the‐Fly Multistage Adaptive Testing for PISA Assessment\",\"authors\":\"Xiuxiu Tang, Yi Zheng, Tong Wu, K. Hau, H. Chang\",\"doi\":\"10.1111/jedm.12403\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multistage adaptive testing (MST) has been recently adopted for international large‐scale assessments such as Programme for International Student Assessment (PISA). MST offers improved measurement efficiency over traditional nonadaptive tests and improved practical convenience over single‐item‐adaptive computerized adaptive testing (CAT). As a third alternative adaptive test design to MST and CAT, Zheng and Chang proposed the “on‐the‐fly multistage adaptive testing” (OMST), which combines the benefits of MST and CAT and offsets their limitations. In this study, we adopted the OMST design while also incorporating response time (RT) in item selection. Via simulations emulating the PISA 2018 reading test, including using the real item attributes and replicating PISA 2018 reading test's MST design, we compared the performance of our OMST designs against the simulated MST design in (1) measurement accuracy of test takers’ ability, (2) test time efficiency and consistency, and (3) expected gains in precision by design. We also investigated the performance of OMST in item bank usage and constraints management. 
Results show great potential for the proposed RT‐incorporated OMST designs to be used for PISA and potentially other international large‐scale assessments.\",\"PeriodicalId\":47871,\"journal\":{\"name\":\"Journal of Educational Measurement\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2024-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational Measurement\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1111/jedm.12403\",\"RegionNum\":4,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"PSYCHOLOGY, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Measurement","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1111/jedm.12403","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"PSYCHOLOGY, APPLIED","Score":null,"Total":0}
Utilizing Response Time for Item Selection in On‐the‐Fly Multistage Adaptive Testing for PISA Assessment
Multistage adaptive testing (MST) has recently been adopted for international large‐scale assessments such as the Programme for International Student Assessment (PISA). MST offers improved measurement efficiency over traditional nonadaptive tests and improved practical convenience over single‐item‐adaptive computerized adaptive testing (CAT). As a third adaptive test design alternative to MST and CAT, Zheng and Chang proposed "on‐the‐fly multistage adaptive testing" (OMST), which combines the benefits of MST and CAT while offsetting their limitations. In this study, we adopted the OMST design and additionally incorporated response time (RT) into item selection. Via simulations emulating the PISA 2018 reading test (using its real item attributes and replicating its MST design), we compared the performance of our OMST designs against the simulated MST design in (1) measurement accuracy of test takers' ability, (2) test time efficiency and consistency, and (3) expected gains in precision by design. We also investigated the performance of OMST in item bank usage and constraints management. Results show great potential for the proposed RT‐incorporated OMST designs to be used for PISA and potentially other international large‐scale assessments.
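The abstract does not spell out how RT enters the selection rule, but a common way to fold response times into adaptive item selection, pairing a 2PL response model with van der Linden's lognormal RT model and ranking items by information per expected second, can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the function names, the bank layout, and the efficiency criterion are assumptions for exposition.

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

def expected_rt(tau, alpha, beta):
    """Expected response time under the lognormal RT model:
    log T ~ Normal(beta - tau, 1/alpha^2), where tau is the test
    taker's speed, beta the item's time intensity, and alpha its
    time discrimination."""
    return np.exp(beta - tau + 0.5 / alpha**2)

def select_stage(theta_hat, tau_hat, bank, stage_len, administered):
    """Assemble the next OMST stage on the fly: choose the stage_len
    unused items that maximize information per expected second at the
    current interim estimates (theta_hat, tau_hat).

    bank: list of (a, b, alpha, beta) tuples, one per item (hypothetical
    layout); administered: set of item indices already given."""
    scores = []
    for j, (a, b, alpha, beta) in enumerate(bank):
        if j in administered:
            scores.append(-np.inf)  # exclude items already seen
            continue
        info = item_information(theta_hat, a, b)
        rt = expected_rt(tau_hat, alpha, beta)
        scores.append(info / rt)  # efficiency: information per time unit
    return np.argsort(scores)[-stage_len:][::-1]
```

In an OMST flow, theta_hat and tau_hat would be re-estimated after each completed stage and select_stage called again for the next one; assembling each stage from interim estimates at run time, rather than from preassembled modules, is what distinguishes OMST from conventional MST.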
Journal introduction:
The Journal of Educational Measurement (JEM) publishes original measurement research, provides reviews of measurement publications, and reports on innovative measurement applications. The topics addressed will interest both measurement theorists and those concerned with the practice of measurement in field settings. In addition to presenting new contributions to measurement theory and practice, JEM also serves as a vehicle for improving educational measurement applications in a variety of settings.