Overcoming challenges in real-world evidence generation: An example from an Adult Medical Care Coordination program

Samuel T. Savitz, Michelle A. Lampman, Shealeigh A. Inselman, Ruchita Dholakia, Vicki L. Hunt, Angela B. Mattson, Robert J. Stroebel, Pamela J. McCabe, Stephanie G. Witwer, Bijan J. Borah

Learning Health Systems, 8(S1), published 2024-05-22. DOI: 10.1002/lrh2.10430
Citations: 0
Abstract
The Adult Medical Care Coordination program ("the program") was implemented at Mayo Clinic to promote patient self-management and reduce 30-day unplanned readmissions among patients at high risk of readmission after hospital discharge. This study aimed to evaluate the impact of the program compared to usual care using a pragmatic, stepped wedge cluster randomized trial ("stepped wedge trial"). However, several challenges arose, including large differences between the study arms. Our goal is to describe these challenges and present lessons learned on how to overcome them and generate evidence to support practice decisions. We describe the challenges encountered during the trial, our approach to addressing them, and lessons learned for other learning health system researchers facing similar challenges. The trial experienced several implementation challenges, including clinics dropping out of the study and care disruptions due to COVID-19. Additionally, there were large differences in the patient populations of the program and usual care arms: for example, the mean age was 76.8 years in the program arm and 68.1 years in the usual care arm. Because of these differences, we adapted our methods, applying the propensity score matching approach traditionally used in observational designs to adjust for differences in observable characteristics. When conducting pragmatic research, researchers will encounter factors beyond their control that may introduce bias. The lessons learned include the need to weigh the tradeoffs of pragmatic design elements and the potential value of adaptive designs for pragmatic trials. Applying these lessons would promote the successful generation of evidence that informs practice decisions.
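The propensity score matching adaptation described above can be illustrated with a minimal sketch. This is not the authors' analysis code; it is a generic, self-contained example assuming synthetic data in which a single covariate (age) is imbalanced between arms, mirroring the 76.8 vs. 68.1 mean-age difference reported in the abstract. It fits a simple logistic model of treatment assignment, then greedily matches each treated unit to its nearest-score control:

```python
import math
import random

def fit_logistic(X, t, lr=0.1, epochs=500):
    """Fit P(treated | x) by plain gradient descent on the logistic loss."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, t):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - ti
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def propensity(x, w, b):
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

def greedy_match(scores, treated, controls):
    """Greedy 1:1 nearest-neighbor matching on the propensity score."""
    pairs, available = [], set(controls)
    for i in treated:
        if not available:
            break
        j = min(available, key=lambda c: abs(scores[c] - scores[i]))
        pairs.append((i, j))
        available.remove(j)
    return pairs

# Hypothetical data: older patients are more likely to receive the program,
# creating the kind of age imbalance the study had to adjust for.
random.seed(0)
ages = [random.gauss(72, 8) for _ in range(200)]
X = [[(a - 72) / 8] for a in ages]                      # standardized age
t = [1 if random.random() < 1 / (1 + math.exp(-(2 * x[0] - 1))) else 0
     for x in X]

w, b = fit_logistic(X, t)
scores = [propensity(x, w, b) for x in X]
treated = [i for i, ti in enumerate(t) if ti == 1]
controls = [i for i, ti in enumerate(t) if ti == 0]
pairs = greedy_match(scores, treated, controls)

mean = lambda idx: sum(ages[i] for i in idx) / len(idx)
matched_controls = [j for _, j in pairs]
# After matching, the control group's mean age should sit much closer
# to the treated group's mean age than before matching.
print(mean(treated), mean(controls), mean(matched_controls))
```

In practice a study of this scale would use an established implementation (e.g. matching routines in R or Python statistics libraries) with many covariates, caliper constraints, and balance diagnostics; the sketch only conveys the core idea of matching on an estimated treatment probability to adjust for observable differences between arms.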