{"title":"社区大学个体水平随机对照试验样本量规划的设计参数。","authors":"Marie-Andrée Somers, Michael J Weiss, Colin Hill","doi":"10.1177/0193841X221121236","DOIUrl":null,"url":null,"abstract":"<p><p>The last two decades have seen a dramatic increase in randomized controlled trials (RCTs) conducted in community colleges. Yet, there is limited empirical information on the design parameters necessary to plan the sample size for RCTs in this context. For a blocked student-level random assignment research design, key design parameters for the minimum detectable true effect (MDTE) are the within-block outcome standard deviation <math><mrow><mo>(</mo><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub><mo>)</mo></mrow></math> and the within-block outcome variance explained by baseline covariates like student characteristics <math><mrow><mo>(</mo><msubsup><mi>R</mi><mrow><mo>|</mo><mi>S</mi></mrow><mn>2</mn></msubsup><mo>)</mo></mrow></math>. We provide empirical estimates of these key design parameters, discussing the pattern of estimates across outcomes (enrollment, credits earned, credential attainment, and grade point average), semesters, and studies. The main analyses use student-level data from 8 to 14 RCTs including 5,649-7,099 students (depending on the outcome) with follow-up data for 3 years. The following patterns are observed: the within-block standard deviation <math><mrow><mo>(</mo><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub><mo>)</mo></mrow></math> and therefore the MDTE can be much larger in later semesters for enrollment outcomes and cumulative credits earned; there is substantial variation across studies in <math><mrow><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub></mrow></math> for degree attainment; and baseline covariates explain less than 10% of the variation in student outcomes. These findings indicate that when planning the sample size for a study, researchers should be mindful of the follow-up period, use a range of values to calculate the MDTE for outcomes that vary across studies, and assume a value of <math><mrow><msubsup><mi>R</mi><mrow><mo>|</mo><mi>S</mi></mrow><mn>2</mn></msubsup></mrow></math> between 0 and 0.05. A public database created for this paper includes parameter estimates for additional RCTs and students.</p>","PeriodicalId":47533,"journal":{"name":"Evaluation Review","volume":"47 4","pages":"599-629"},"PeriodicalIF":3.0000,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/ee/45/10.1177_0193841X221121236.PMC10278387.pdf","citationCount":"0","resultStr":"{\"title\":\"Design Parameters for Planning the Sample Size of Individual-Level Randomized Controlled Trials in Community Colleges.\",\"authors\":\"Marie-Andrée Somers, Michael J Weiss, Colin Hill\",\"doi\":\"10.1177/0193841X221121236\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The last two decades have seen a dramatic increase in randomized controlled trials (RCTs) conducted in community colleges. Yet, there is limited empirical information on the design parameters necessary to plan the sample size for RCTs in this context. 
For a blocked student-level random assignment research design, key design parameters for the minimum detectable true effect (MDTE) are the within-block outcome standard deviation <math><mrow><mo>(</mo><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub><mo>)</mo></mrow></math> and the within-block outcome variance explained by baseline covariates like student characteristics <math><mrow><mo>(</mo><msubsup><mi>R</mi><mrow><mo>|</mo><mi>S</mi></mrow><mn>2</mn></msubsup><mo>)</mo></mrow></math>. We provide empirical estimates of these key design parameters, discussing the pattern of estimates across outcomes (enrollment, credits earned, credential attainment, and grade point average), semesters, and studies. The main analyses use student-level data from 8 to 14 RCTs including 5,649-7,099 students (depending on the outcome) with follow-up data for 3 years. The following patterns are observed: the within-block standard deviation <math><mrow><mo>(</mo><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub><mo>)</mo></mrow></math> and therefore the MDTE can be much larger in later semesters for enrollment outcomes and cumulative credits earned; there is substantial variation across studies in <math><mrow><msub><mi>σ</mi><mrow><mo>|</mo><mi>S</mi></mrow></msub></mrow></math> for degree attainment; and baseline covariates explain less than 10% of the variation in student outcomes. These findings indicate that when planning the sample size for a study, researchers should be mindful of the follow-up period, use a range of values to calculate the MDTE for outcomes that vary across studies, and assume a value of <math><mrow><msubsup><mi>R</mi><mrow><mo>|</mo><mi>S</mi></mrow><mn>2</mn></msubsup></mrow></math> between 0 and 0.05. A public database created for this paper includes parameter estimates for additional RCTs and students.</p>\",\"PeriodicalId\":47533,\"journal\":{\"name\":\"Evaluation Review\",\"volume\":\"47 4\",\"pages\":\"599-629\"},\"PeriodicalIF\":3.0000,\"publicationDate\":\"2023-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/ee/45/10.1177_0193841X221121236.PMC10278387.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Evaluation Review\",\"FirstCategoryId\":\"90\",\"ListUrlMain\":\"https://doi.org/10.1177/0193841X221121236\",\"RegionNum\":4,\"RegionCategory\":\"社会学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"SOCIAL SCIENCES, INTERDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evaluation Review","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.1177/0193841X221121236","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"SOCIAL SCIENCES, INTERDISCIPLINARY","Score":null,"Total":0}
Design Parameters for Planning the Sample Size of Individual-Level Randomized Controlled Trials in Community Colleges.
The last two decades have seen a dramatic increase in randomized controlled trials (RCTs) conducted in community colleges. Yet, there is limited empirical information on the design parameters needed to plan the sample size for RCTs in this context. For a blocked, student-level random assignment research design, the key design parameters for the minimum detectable true effect (MDTE) are the within-block outcome standard deviation (σ_|S) and the within-block outcome variance explained by baseline covariates such as student characteristics (R²_|S). We provide empirical estimates of these key design parameters, discussing the pattern of estimates across outcomes (enrollment, credits earned, credential attainment, and grade point average), semesters, and studies. The main analyses use student-level data from 8 to 14 RCTs, including 5,649-7,099 students (depending on the outcome), with three years of follow-up data. The following patterns are observed: the within-block standard deviation (σ_|S), and therefore the MDTE, can be much larger in later semesters for enrollment outcomes and cumulative credits earned; σ_|S for degree attainment varies substantially across studies; and baseline covariates explain less than 10% of the variation in student outcomes. These findings indicate that when planning the sample size for a study, researchers should be mindful of the follow-up period, use a range of values to calculate the MDTE for outcomes that vary across studies, and assume a value of R²_|S between 0 and 0.05. A public database created for this paper includes parameter estimates for additional RCTs and students.
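To make the role of these parameters concrete, the sketch below (Python, standard library only) computes an MDTE using the widely used normal-approximation formula for individual-level random assignment, MDTE ≈ (z_{1-α/2} + z_{power}) · σ_|S · sqrt((1 − R²_|S) / (P(1−P)N)). This is not the authors' code: the σ_|S value, sample size, and treatment share are hypothetical, and degrees-of-freedom corrections for block fixed effects are ignored; the R²_|S values follow the 0-0.05 range recommended in the abstract.

```python
# Illustrative sketch (not the authors' code): minimum detectable true effect
# (MDTE) for a blocked, individual-level RCT under the standard
# normal-approximation formula
#   MDTE ≈ (z_{1-α/2} + z_{power}) · σ_|S · sqrt((1 - R²_|S) / (P·(1-P)·N))
from statistics import NormalDist

def mdte(sigma_within_block: float,
         r2_within_block: float,
         n_students: int,
         prop_treated: float = 0.5,
         alpha: float = 0.05,
         power: float = 0.80) -> float:
    """Minimum detectable true effect, in the outcome's natural units."""
    z = NormalDist()  # standard normal
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)  # ≈ 2.80
    design_factor = (1 - r2_within_block) / (
        prop_treated * (1 - prop_treated) * n_students)
    return multiplier * sigma_within_block * design_factor ** 0.5

# Hypothetical example: cumulative credits earned, assuming σ_|S = 10 credits
# and N = 1,000 students, with R²_|S at the endpoints of the recommended range.
for r2 in (0.00, 0.05):
    print(f"R²_|S = {r2:.2f}: MDTE ≈ {mdte(10.0, r2, 1_000):.2f} credits")
```

Under these assumptions, the MDTE shrinks only slightly (by roughly 2.5%) as R²_|S moves from 0 to 0.05, which is why the abstract cautions against counting on large precision gains from baseline covariates when sizing a study.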
Journal introduction:
Evaluation Review is the forum for researchers, planners, and policy makers engaged in the development, implementation, and utilization of studies aimed at the betterment of the human condition. The Editors invite submission of papers reporting the findings of evaluation studies in such fields as child development, health, education, income security, manpower, mental health, criminal justice, and the physical and social environments. In addition, Evaluation Review will contain articles on methodological developments, discussions of the state of the art, and commentaries on issues related to the application of research results. Special features will include periodic review essays, "research briefs", and "craft reports".