Challenges and Adjustments in a Multisite School-Based Randomized Field Trial
Debbie L. Hahs-Vaughn, Christine Depies DeStefano, Christopher D. Charles, Mary Little
American Journal of Evaluation, published 2024-03-11. DOI: 10.1177/10982140241236390
Citations: 0
Abstract
Randomized experiments are a strong design for establishing impact evidence because the random assignment mechanism theoretically allows confidence in attributing group differences to the intervention. Growth of randomized experiments within educational studies has been widely documented. However, randomized experiments within education have received criticism for implementation challenges and for ignoring context. Additionally, limited guidance exists for programs that are tasked with both implementation and evaluation within the same funding period. This study draws on a research team's experiences and examines opportunities and challenges in conducting a multisite randomized evaluation of an internship program for teacher candidates. We discuss how problems were collaboratively addressed and how the evaluation was adjusted to align with local realities, and we demonstrate how the research team, in consultation with local stakeholders, resolved methodological and program implementation problems in the field. Recommendations for future research are provided.
About the Journal
The American Journal of Evaluation (AJE) publishes original papers about the methods, theory, practice, and findings of evaluation. The general goal of AJE is to present the best work in and about evaluation, in order to improve the knowledge base and practice of its readers. Because the field of evaluation is diverse, with different intellectual traditions, approaches to practice, and domains of application, the papers published in AJE will reflect this diversity. Nevertheless, preference is given to papers that are likely to be of interest to a wide range of evaluators and that are written to be accessible to most readers.