{"title":"Comments","authors":"R. Boruch, M. Vinovskis","doi":"10.1353/pep.2005.0003","DOIUrl":null,"url":null,"abstract":"Recent dissatisfaction with public education in the United States has been matched by dismay with the current state of education research. A common complaint is that education research is good at description and hypothesis generation but not at answering causal questions about the effects of education policies on student outcomes.1 In this vein, many policymakers have expressed frustration that, as Ellen Condliffe Lagemann has noted, “education research has not yielded dramatic improvements in practice of the kind one can point to in medicine.”2 Such dissatisfaction has contributed to a number of recent federal policy changes intended to improve the quality of research in education, including the creation of a new Institute of Education Sciences (IES) to support increased experimentation within education and an emphasis on the use of teaching methods supported by “scientifically-based research” in the 2001 No Child Left Behind Act (NCLB). In this paper we consider the possible effects of these recent changes on the state of education research. We focus on what might be termed program or policy evaluation—research that aims to support causal inferences about the efficacy of specific educational programs or policies. Examples include studies that examine whether smaller class size improves student achievement, whether a particular reading curriculum leads to increased reading comprehension, and whether “pull-out” programs are more effective than “push-in” programs for students with learning disabilities. It is important to note that a great deal of research in education does not aim to answer these types of questions but rather","PeriodicalId":9272,"journal":{"name":"Brookings Papers on Education Policy","volume":"27 1","pages":"67 - 80"},"PeriodicalIF":0.0000,"publicationDate":"2005-02-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Brookings Papers on Education Policy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1353/pep.2005.0003","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Recent dissatisfaction with public education in the United States has been matched by dismay with the current state of education research. A common complaint is that education research is good at description and hypothesis generation but not at answering causal questions about the effects of education policies on student outcomes.1 In this vein, many policymakers have expressed frustration that, as Ellen Condliffe Lagemann has noted, “education research has not yielded dramatic improvements in practice of the kind one can point to in medicine.”2 Such dissatisfaction has contributed to a number of recent federal policy changes intended to improve the quality of research in education, including the creation of a new Institute of Education Sciences (IES) to support increased experimentation within education and an emphasis on the use of teaching methods supported by “scientifically-based research” in the 2001 No Child Left Behind Act (NCLB). In this paper we consider the possible effects of these recent changes on the state of education research. We focus on what might be termed program or policy evaluation—research that aims to support causal inferences about the efficacy of specific educational programs or policies. Examples include studies that examine whether smaller class size improves student achievement, whether a particular reading curriculum leads to increased reading comprehension, and whether “pull-out” programs are more effective than “push-in” programs for students with learning disabilities. It is important to note that a great deal of research in education does not aim to answer these types of questions but rather