{"title":"Modeling Slipping Effects in a Large-Scale Assessment with Innovative Item Formats","authors":"Ismail Cuhadar, Salih Binici","doi":"10.1111/emip.12508","DOIUrl":null,"url":null,"abstract":"<p>This study employs the 4-parameter logistic item response theory model to account for the unexpected incorrect responses or slipping effects observed in a large-scale Algebra 1 End-of-Course assessment, including several innovative item formats. It investigates whether modeling the misfit at the upper asymptote has any practical impact on the parameter estimates. With a simulation study, it also investigates the amount of bias in the parameter estimates when the slipping effects are ignored. Findings from the empirical data indicate that the impact of ignoring slipping effects is negligible when the abilities are evaluated within the context of classification of students into performance levels; however, it is present toward the extreme ends of ability continuum within the context of individual abilities. Findings from the simulations reveal that when the proportion of items with the slipping effects is small (20%), ignoring misfit does not have practical importance; however, when the proportion of items with the slipping effects is moderate to large (50%–80%), the abilities are generally underestimated at both ends of ability scale. When an upper asymptote parameter was used for modeling the slipping effects, the items became easier and more discriminative in general than the model ignoring the slipping effects.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":null,"pages":null},"PeriodicalIF":2.7000,"publicationDate":"2022-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Measurement-Issues and Practice","FirstCategoryId":"95","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/emip.12508","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 1
Abstract
This study employs the 4-parameter logistic (4PL) item response theory model to account for the unexpected incorrect responses, or slipping effects, observed in a large-scale Algebra 1 End-of-Course assessment that includes several innovative item formats. It investigates whether modeling the misfit at the upper asymptote has any practical impact on the parameter estimates. Through a simulation study, it also quantifies the bias in the parameter estimates when the slipping effects are ignored. Findings from the empirical data indicate that the impact of ignoring slipping effects is negligible when ability estimates are used to classify students into performance levels; however, the impact emerges toward the extreme ends of the ability continuum when individual ability estimates are considered. Findings from the simulations reveal that when the proportion of items with slipping effects is small (20%), ignoring the misfit has little practical importance; however, when the proportion of items with slipping effects is moderate to large (50%–80%), abilities are generally underestimated at both ends of the ability scale. When an upper asymptote parameter was used to model the slipping effects, the items generally appeared easier and more discriminating than under the model that ignored the slipping effects.
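For context, the 4PL model extends the 3PL model by adding an upper-asymptote parameter d; when d < 1, even very able examinees have a nonzero probability of slipping on the item. Below is a minimal sketch of the standard 4PL item response function in Python; the parameter values are illustrative only and are not taken from the study.

```python
import numpy as np

def irf_4pl(theta, a, b, c, d):
    """4PL item response function: probability of a correct response.

    theta: examinee ability
    a: discrimination, b: difficulty,
    c: lower asymptote (guessing), d: upper asymptote (< 1 implies slipping).
    Setting d = 1 recovers the 3PL model.
    """
    return c + (d - c) / (1.0 + np.exp(-a * (theta - b)))

# Illustrative example: a high-ability examinee on the same item
# with and without a slipping (upper-asymptote) parameter.
theta = 2.5
p_no_slip = irf_4pl(theta, a=1.2, b=0.0, c=0.2, d=1.0)  # 3PL special case
p_slip = irf_4pl(theta, a=1.2, b=0.0, c=0.2, d=0.9)     # success probability capped near 0.9
print(round(p_no_slip, 3), round(p_slip, 3))
```

The sketch shows the mechanism the abstract refers to: with d < 1, the response curve flattens below 1 at high ability, so an estimation model that fixes d = 1 must absorb those unexpected incorrect responses elsewhere, which is the source of the bias examined in the study.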