Getting Rigor Right: A Framework for Methodological Choice in Adaptive Monitoring and Evaluation

Christina Synowiec, Erin Fletcher, Luke Heinkel, Taylor Salisbury

DOI: 10.9745/GHSP-D-22-00243
Published: 2023-12-18 (Journal Article)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10727457/pdf/
Citations: 0
Abstract
The field of global development has embraced the idea that programs require agile, adaptive approaches to monitoring, evaluation, and learning. But considerable debate remains about which methods are most appropriate for adaptive learning. Researchers have a range of proven and novel tools to promote a culture of adaptation and learning. These tools include lean testing, rapid prototyping, formative research, and structured experimentation, all of which can be used to generate responsive feedback (RF) to improve social change programs. With such an extensive toolkit, how should one decide which methods to employ? In our experience, the level of rigor used should be responsive to the team's level of certainty about the program design being investigated: how certain, or confident, are we that a program design will produce its intended results? With less certainty, less rigor is needed; with more certainty, more rigor is needed. In this article, we present a framework for getting rigor right and illustrate its use in 3 case studies. For each example, we describe the feedback methods used and why, how the approach was implemented (including how we conducted cocreation and ensured buy-in), and the results of each engagement. We conclude with lessons learned from these examples and guidance on how to use the right kind of RF mechanism to improve social change programs.