{"title":"Closed formula of test length required for adaptive testing with medium probability of solution","authors":"Judit T. Kárász, K. Széll, Szabolcs Takács","doi":"10.1108/qae-03-2023-0042","DOIUrl":null,"url":null,"abstract":"\nPurpose\nBased on the general formula, which depends on the length and difficulty of the test, the number of respondents and the number of ability levels, this study aims to provide a closed formula for the adaptive tests with medium difficulty (probability of solution is p = 1/2) to determine the accuracy of the parameters for each item and in the case of calibrated items, determine the required test length given number of respondents.\n\n\nDesign/methodology/approach\nEmpirical results have been obtained on computerized or multistage adaptive implementation. Simulation studies and classroom/experimental results show that adaptive tests can measure test subjects’ ability to the same quality over half the test length compared to linear versions. Due to the complexity of the problem, the authors discuss a closed mathematical formula: the relationship between the length of the tests, the difficulty of solving the items, the number of respondents and the levels of ability.\n\n\nFindings\nThe authors present a closed formula that provides a lower bound for the minimum test length in the case of adaptive tests. The authors also present example calculations using the formula, based on the assessment framework of some student assessments to show the similarity between the theoretical calculations and the empirical results.\n\n\nOriginality/value\nWith this formula, we can form a connection between theoretical and simulation results.\n","PeriodicalId":46734,"journal":{"name":"QUALITY ASSURANCE IN EDUCATION","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2023-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"QUALITY ASSURANCE IN EDUCATION","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/qae-03-2023-0042","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Abstract
Purpose
Based on the general formula, which depends on the length and difficulty of the test, the number of respondents and the number of ability levels, this study aims to provide a closed formula for adaptive tests of medium difficulty (probability of solution p = 1/2) that determines the accuracy of the parameters for each item and, in the case of calibrated items, the required test length for a given number of respondents.
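As a rough orientation only (this is the standard binary-search, information-theoretic bound, not the paper's closed formula, which also involves the number of respondents and the accuracy of the item parameters): when every administered item is solved with probability p = 1/2, each response carries at most one bit of information, so separating m ability levels requires at least

\[
L_{\min} \;\ge\; \lceil \log_2 m \rceil
\]

items per respondent. For example, distinguishing m = 8 ability levels requires at least 3 such items.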
Design/methodology/approach
Empirical results have been obtained from computerized and multistage adaptive implementations. Simulation studies and classroom/experimental results show that adaptive tests can measure test takers' ability with the same precision at roughly half the test length of linear versions. Because of the complexity of the problem, the authors discuss a closed mathematical formula for the relationship between the length of the test, the difficulty of the items, the number of respondents and the number of ability levels.
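The following toy Monte Carlo sketch illustrates the kind of adaptive-versus-linear comparison summarized above, under simplifying assumptions that are not taken from the paper: a Rasch (1PL) response model, grid-based maximum-likelihood ability estimation, and an adaptive rule that places each next item's difficulty at the current estimate so the probability of solution stays near 1/2. All function names and parameter values are illustrative.

# Illustrative simulation only; not the authors' procedure or data.
import numpy as np

rng = np.random.default_rng(0)

def rasch_prob(theta, b):
    """Probability of a correct response under the 1PL (Rasch) model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def simulate(theta_true, n_items, adaptive, item_bank):
    """Administer n_items and return the final ability estimate.

    adaptive=True: each item's difficulty equals the current estimate,
    so the expected probability of solution stays near 1/2.
    adaptive=False: items are taken in a fixed order from item_bank.
    """
    theta_hat = 0.0
    responses, difficulties = [], []
    grid = np.linspace(-4, 4, 161)              # candidate ability levels
    for k in range(n_items):
        b = theta_hat if adaptive else item_bank[k]
        x = rng.random() < rasch_prob(theta_true, b)
        responses.append(x)
        difficulties.append(b)
        # Maximum-likelihood estimate of ability on the grid
        ll = np.zeros_like(grid)
        for xi, bi in zip(responses, difficulties):
            p = rasch_prob(grid, bi)
            ll += np.log(p) if xi else np.log(1 - p)
        theta_hat = grid[np.argmax(ll)]
    return theta_hat

# Fixed linear test: difficulties spread evenly over the ability range
bank = np.linspace(-3, 3, 40)
thetas = rng.normal(0, 1, 200)                  # simulated respondents

for label, adaptive, n in [("linear, 40 items", False, 40),
                           ("adaptive, 20 items", True, 20)]:
    err = np.mean([abs(simulate(t, n, adaptive, bank) - t) for t in thetas])
    print(f"{label}: mean absolute error ~ {err:.2f}")

Under these assumptions, the adaptive condition tends to reach a precision comparable to the longer linear test, in line with the roughly half-length finding cited above.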
Findings
The authors present a closed formula that provides a lower bound on the minimum test length for adaptive tests. They also present example calculations with the formula, based on the assessment frameworks of several student assessments, to show the similarity between the theoretical calculations and the empirical results.
Originality/value
With this formula, a connection can be drawn between theoretical and simulation results.
Journal description
QAE publishes original empirical or theoretical articles on Quality Assurance issues, including dimensions and indicators of Quality and Quality Improvement, as applicable to education at all levels, including pre-primary, primary, secondary, higher and professional education. Periodically, QAE also publishes systematic reviews, research syntheses and assessment policy articles on topics of current significance. As an international journal, QAE seeks submissions on topics that have global relevance. Article submissions could pertain to the following areas integral to QAE's mission:
- organizational or program development, change and improvement
- educational testing or assessment programs
- evaluation of educational innovations, programs and projects
- school efficiency assessments
- standards, reforms, accountability, accreditation, and audits in education
- tools, criteria and methods for examining or assuring quality