{"title":"A Framework for Addressing Instrumentation Biases When Using Observation Systems as Outcome Measures in Instructional Interventions","authors":"Mark White, Bridget Maher, Brian Rowan","doi":"10.1080/19345747.2022.2081275","DOIUrl":null,"url":null,"abstract":"Abstract Many educational interventions seek to change teachers’ instructional practice. Standards-based observation systems are a common and useful tool to understand these interventions’ impact, but the process of measuring instructional change with observation systems is highly complex. This paper introduces a framework for examining and understanding potential instrumentation biases that arise when evaluations use observation systems to understand instructional change. The framework systematizes two processes that all studies must undertake: (1) the process of operationalizing the construct of teaching quality, and (2) the process of data collection. A study that engages in these processes generates observation scores that capture their own raters’ perspectives on specific segments of instruction. These scores must be generalized to draw conclusions about the intended constructs and settings. Systematizing these two processes highlights the necessary steps of a validity argument supporting evaluation conclusions and the instrumentation biases that threaten such conclusions. The framework is illustrated with an example from our recent work, which sought to understand instructional change since the adoption of the Common Core State Standards (CCSS).","PeriodicalId":47260,"journal":{"name":"Journal of Research on Educational Effectiveness","volume":null,"pages":null},"PeriodicalIF":1.7000,"publicationDate":"2022-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Research on Educational Effectiveness","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1080/19345747.2022.2081275","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 2
Abstract
Many educational interventions seek to change teachers’ instructional practice. Standards-based observation systems are a common and useful tool for understanding these interventions’ impact, but the process of measuring instructional change with observation systems is highly complex. This paper introduces a framework for examining and understanding potential instrumentation biases that arise when evaluations use observation systems to understand instructional change. The framework systematizes two processes that all studies must undertake: (1) the process of operationalizing the construct of teaching quality, and (2) the process of data collection. A study that engages in these processes generates observation scores that capture its raters’ perspectives on specific segments of instruction. These scores must be generalized to draw conclusions about the intended constructs and settings. Systematizing these two processes highlights the necessary steps of a validity argument supporting evaluation conclusions and the instrumentation biases that threaten such conclusions. The framework is illustrated with an example from our recent work, which sought to understand instructional change since the adoption of the Common Core State Standards (CCSS).
About the journal:
As the flagship publication of the Society for Research on Educational Effectiveness, the Journal of Research on Educational Effectiveness (JREE) publishes original articles from the multidisciplinary community of researchers who are committed to applying principles of scientific inquiry to the study of educational problems. Articles published in JREE should advance our knowledge of factors important for educational success and/or improve our ability to conduct further disciplined studies of pressing educational problems. JREE welcomes manuscripts that fit into one of the following categories: (1) intervention, evaluation, and policy studies; (2) theory, contexts, and mechanisms; and (3) methodological studies. The first category includes studies that focus on process and implementation and seek to demonstrate causal claims in educational research. The second category includes meta-analyses and syntheses, descriptive studies that illuminate educational conditions and contexts, and studies that rigorously investigate educational processes and mechanisms. The third category includes studies that advance our understanding of theoretical and technical features of measurement and research design and describe advances in data analysis and data modeling. To establish a stronger connection between scientific evidence and educational practice, studies submitted to JREE should focus on pressing problems found in classrooms and schools. Studies that help advance our understanding of, and demonstrate effectiveness related to, challenges in reading, mathematics education, and science education are especially welcome, as are studies related to cognitive functions, social processes, organizational factors, and cultural features that mediate and/or moderate critical educational outcomes. On occasion, invited responses to JREE articles and rejoinders to those responses will be included in an issue.