Isa Steinmann, Daniel Sánchez, Saskia van Laar, J. Braeken
{"title":"The impact of inconsistent responders to mixed-worded scales on inferences in international large-scale assessments","authors":"Isa Steinmann, Daniel Sánchez, Saskia van Laar, J. Braeken","doi":"10.1080/0969594X.2021.2005302","DOIUrl":null,"url":null,"abstract":"ABSTRACT Questionnaire scales that are mixed-worded, i.e. include both positively and negatively worded items, often suffer from issues like low reliability and more complex latent structures than intended. Part of the problem might be that some responders fail to respond consistently to the mixed-worded items. We investigated the prevalence and impact of inconsistent responders in 37 primary education systems participating in the joint PIRLS/TIMSS 2011 assessment. Using the mean absolute difference method and three mixed-worded self-concept scales, we identified between 2%‒36% of students as inconsistent responders across education systems. Consistent with expectations, these students showed lower average achievement scores and had a higher risk of being identified as inconsistent on more than one scale. We also found that the inconsistent responders biased the estimated dimensionality and reliability of the scales. The impact on external validity measures was limited and unsystematic. 
We discuss implications for the use and development of questionnaire scales.","PeriodicalId":51515,"journal":{"name":"Assessment in Education-Principles Policy & Practice","volume":"16 1","pages":"5 - 26"},"PeriodicalIF":2.7000,"publicationDate":"2021-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessment in Education-Principles Policy & Practice","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1080/0969594X.2021.2005302","RegionNum":3,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 3
Abstract
Questionnaire scales that are mixed-worded, i.e. include both positively and negatively worded items, often suffer from issues such as low reliability and more complex latent structures than intended. Part of the problem might be that some responders fail to respond consistently to the mixed-worded items. We investigated the prevalence and impact of inconsistent responders in 37 primary education systems participating in the joint PIRLS/TIMSS 2011 assessment. Using the mean absolute difference method and three mixed-worded self-concept scales, we identified between 2% and 36% of students as inconsistent responders across education systems. Consistent with expectations, these students showed lower average achievement scores and had a higher risk of being identified as inconsistent on more than one scale. We also found that the inconsistent responders biased the estimated dimensionality and reliability of the scales. The impact on external validity measures was limited and unsystematic. We discuss implications for the use and development of questionnaire scales.
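To make the screening idea concrete, the following is a minimal sketch of one common variant of the mean absolute difference approach: reverse-code the negatively worded items, then compare each respondent's mean on the positively worded items with their mean on the recoded negatively worded items. The item sets, the 4-point scale, and the cutoff value here are illustrative assumptions, not the exact operationalisation used in the study.

```python
def recode(response, max_point=4):
    """Reverse-code a Likert response on a 1..max_point scale."""
    return max_point + 1 - response

def mean(values):
    return sum(values) / len(values)

def mad_score(pos_items, neg_items, max_point=4):
    """Absolute difference between the mean of the positively worded
    items and the mean of the reverse-coded negatively worded items."""
    neg_recoded = [recode(r, max_point) for r in neg_items]
    return abs(mean(pos_items) - mean(neg_recoded))

def flag_inconsistent(pos_items, neg_items, cutoff=1.0, max_point=4):
    """Flag a respondent whose score exceeds a (hypothetical) cutoff."""
    return mad_score(pos_items, neg_items, max_point) > cutoff

# A consistent responder endorses positive items and rejects negative ones,
# so the recoded means line up and the score is near zero.
flag_inconsistent(pos_items=[4, 4, 3], neg_items=[1, 2, 1])  # → False

# An inconsistent responder agrees with both wordings, producing a large gap.
flag_inconsistent(pos_items=[4, 4, 4], neg_items=[4, 4, 4])  # → True
```

Under this sketch, a respondent who answers "agree" to both "I like reading" and "I do not like reading" receives a high score and is flagged; where exactly to place the cutoff is a substantive decision.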
Journal description:
Recent decades have witnessed significant developments in the field of educational assessment. New approaches to the assessment of student achievement have been complemented by the increasing prominence of educational assessment as a policy issue. In particular, there has been a growth of interest in modes of assessment that promote, as well as measure, standards and quality. These have profound implications for individual learners, institutions and the educational system itself. Assessment in Education provides a focus for scholarly output in the field of assessment. The journal is explicitly international in focus and encourages contributions from a wide range of assessment systems and cultures. The journal's intention is to explore both commonalities and differences in policy and practice.