Latest Articles: American Journal of Evaluation

Book Review: Utilization-focused Evaluation, 5th Edition
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-06-01 · DOI: 10.1177/10982140221122772
Author: R. Miller
Citations: 0
A Human Rights-Based Evaluation Approach for Inclusive Education
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-05-28 · DOI: 10.1177/10982140231153810
Authors: C. Johnstone, A. Hayes, Elisheva Cohen, Hayley Niad, George Laryea-Adjei, K. Letshabo, Adrian Shikwe, A. Agu
Abstract: This article reports on ways in which United Nations human rights treaties can be used as a normative framework for evaluating program outcomes. In this article, we conceptualize a human rights-based approach to program evaluation and locate this approach within the broader evaluation literature. The article describes how a rights-based framework can be used as an aspirational set of indicators for program evaluations to promote activities that align with internationally agreed-upon human rights norms. We then describe a case study of the evaluation through which this method was developed, including its sampling design, methodology, and findings. The United Nations Children's Fund (UNICEF) inclusive education evaluation described here highlighted the need for conceptual clarity around what inclusive education is, and the importance of contextualized innovation toward meeting the educational rights of children with disabilities. Human rights perspectives and evaluation designs can help create such clarity, but should also be used with care.
Citations: 0
Book Review: Leading Change Through Evaluation: Improvement Science in Action by Kristen L. Rohanna
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-05-07 · DOI: 10.1177/10982140231153376
Author: Valerie Marshall
Citations: 0
A Protocol to Assess Contextual Factors During Program Impact Evaluation: A Case Study of a STEM Gender Equity Intervention in Higher Education
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-05-02 · DOI: 10.1177/10982140231152281
Authors: Suzanne Nobrega, Kasper Edwards, Mazen El Ghaziri, Lauren Giacobbe, Serena Rice, L. Punnett
Abstract: Program evaluations that lack experimental design often fail to produce evidence of impact because there is no available control group. Theory-based evaluations can generate evidence of a program's causal effects if evaluators collect evidence along the theorized causal chain and identify possible competing causes. However, few methods are available for assessing competing causes in the program environment. Effect Modifier Assessment (EMA) is a method previously used in smaller-scale studies to assess possible competing causes of observed changes following an intervention. In our case study of a university gender equity intervention, EMA generated useful evidence of competing causes to augment program evaluation. Top-down administrative culture, poor experiences with hiring and promotion, and workload were identified as impeding forces that might have reduced program benefits. The EMA addresses a methodological gap in theory-based evaluation and might be useful in a variety of program settings.
Citations: 0
From the Co-Editors: Honoring the Past to Inform Current and Future Evaluation
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-04-27 · DOI: 10.1177/10982140231169134
Authors: J. Hall, Laura R. Peck
Citations: 0
“A Lot of It Really Does Come Down to Values”: An Empirical Study of the Values Advanced by Seasoned Evaluators
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-04-26 · DOI: 10.1177/10982140231153805
Authors: Rebecca M. Teasdale, Jennifer R. McNeilly, Maria Isabel Ramírez Garzón, J. Novak, Jennifer C. Greene
Abstract: This study challenges persistent misrepresentations of evaluation as a value-neutral inquiry process by presenting an empirical study that deepens understanding of evaluators' values and how they “show up” in evaluation practice. Through semistructured interviews and inductive analysis, we examined the values advanced by a sample of eight experienced evaluators. We surfaced and examined 12 values, which we organized into five clusters, that shaped the constitutive elements of the studies these evaluators conducted and guided how the evaluators positioned their work. Our findings provide empirical evidence about the role of values in evaluation practice and can support evaluators in reflecting on their own values and enacting their professional and ethical responsibilities to identify and articulate their values in the context of evaluation practice.
Citations: 1
Social Ontology and Evaluation—A Comment on “Framing Evaluation in Reality: An Introduction to Ontologically Integrative Evaluation”
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-03-01 · DOI: 10.1177/10982140221134779
Author: R. Picciotto
Abstract: According to Jennifer Billman, western evaluation bias against indigenous thinking is due to ontological incompetence. If so, the solution she offers (a highly abstract list of criteria) is inadequate, since it fails to address, let alone resolve, a wide range of philosophical dilemmas at the intersection of logic and ontology. Furthermore, it fails to “frame evaluation in reality,” since it ignores the patent fact that, in the market society, positivist evaluators dominate. They are value-free, embrace a “clockwork” conception of the natural and social world, and do not question decision makers' goals. By contrast, constructivist evaluators recognize that social facts differ from natural facts, since they are socially constructed and clustered within institutions that define roles, norms, and expectations. It follows that constructivist evaluation holds the key to the problem identified by Billman, since it resists capture by vested interests, gives pride of place to the relational context, and embraces the validity of indigenous thinking.
Citations: 0
Section Editor's Note: Using Power Insights to Better Plan Experiments
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-03-01 · DOI: 10.1177/10982140231154695
Author: Laura R. Peck
Abstract: How many people need to be in my evaluation in order to be able to detect a policy- or program-relevant impact? If the program being evaluated is assigned to participants at an aggregate “cluster” or group level, such as classrooms filled with students, how many of those groups do I need? How many participants within each group? What if I am interested in subgroup effects; how many people or groups do I need then? Answers to these questions are essential for smart planning of experimental evaluations and are the motivation for this Experimental Methodology Section. Before I summarize the contributions of this Section's three articles, let me first define some key concepts and explain what I see to be the main issues for this piece of experimental evaluation work. To begin, statistical “power” refers to an evaluation's ability to detect an effect that is statistically significant, and minimum detectable effects (MDEs) are the smallest estimated effects that a given design can detect as statistically significant. Ultimately, the effect size is what a given evaluation is designed to estimate, and the evaluator will have to determine (1) what sample design and size are needed to detect that effect, or (2) what MDE is feasible, given budget and sample design and size realities. Several interrelated factors influence a study's MDE, including (as drawn partly from Peck, 2020, Appendix Box A.1) the choices and realities of the statistical significance threshold, statistical power, the variance of the impact estimate, the level and variability of the outcome measure, and the clustered nature of the data, as elaborated next. Statistical significance threshold: the statistical significance level is the probability of identifying a false positive result (also referred to as Type I error). The MDE becomes larger as the statistical significance level decreases; all else equal, an impact must be larger to be detected with a statistical significance threshold of 1% than with one of 10%. Substantial debate in statistics and related fields focuses on “the p-value” and its value in establishing evidence (e.g., Wasserstein & Lazar, 2016). Statistical power: statistical power is equal to the probability of correctly rejecting the null hypothesis (or, one minus the probability of a false negative result, or Type II error). In other words, power relates to the analyst's ability to detect an impact that is statistically significant, should it exist. Statistical power is typically set to 80%, although other values may be reasonable too. Missing the detection of a favorable impact (Type II error) has lower up-front cost implications for the study, relative to falsely claiming that a favorable impact exists (Type I error). That said, an insufficiently powered study might lead to not generating new information (or, worse, to incorrect null findings), an ill-funded investment.
Citations: 0
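The editor's note above describes how the MDE depends on the significance threshold, power, and sample size. As a rough numerical sketch of that relationship (not from the article; the function name and the simplified two-arm, individually randomized, no-covariate design are illustrative assumptions):

```python
from statistics import NormalDist

def mde_standardized(n, alpha=0.05, power=0.80, p_treat=0.5):
    """Minimum detectable effect size, in standard-deviation units,
    for a two-arm individually randomized design with a two-sided
    test, no covariate adjustment, and no clustering.

    n        -- total sample size across both arms
    alpha    -- statistical significance threshold (Type I error rate)
    power    -- probability of detecting a true effect (1 - Type II error)
    p_treat  -- share of the sample assigned to treatment
    """
    z = NormalDist().inv_cdf
    # Multiplier combines the critical value for a two-sided test
    # with the quantile corresponding to the desired power.
    multiplier = z(1 - alpha / 2) + z(power)
    # Standard error of the standardized impact estimate.
    se = (1.0 / (p_treat * (1 - p_treat) * n)) ** 0.5
    return multiplier * se
```

With n = 1,000, a 5% two-sided threshold, and 80% power, this sketch gives an MDE of roughly 0.18 standard deviations; tightening alpha to 1% raises the MDE and quadrupling the sample roughly halves it, matching the tradeoffs the note describes.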
What to Expect When You Don't Know What You are Expecting: Vigilance and the Monitoring and Evaluation of an Uncertain World
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-03-01 · DOI: 10.1177/10982140221079639
Authors: R. Goble, Edward R. Carr, Jon Anderson
Abstract: Complexity and uncertainty are long-standing challenges for global development projects. Coping with both requires flexibility and adaptation, the ability to identify unexpected circumstances, seize opportunities, and respond to threats. Vigilance is critical; it resides within the domains of monitoring, evaluation, and learning. In practice, maintaining vigilance is difficult, partly because effective vigilance has a dual nature. Normal, Type 1 vigilance is anchored in knowing what to look for. It demands focus and attention to designated indicators. Type 2 vigilance looks for what project preparations failed to anticipate. It demands defocusing and openness; it sits outside contemporary design of monitoring and evaluation, as it must question the assumptions in project design and implementation. We consider the role of both types of vigilance in global development and difficulties in maintaining both simultaneously. We identify pathways for improving the practice of vigilance and suggest practical steps in a template for pilot efforts.
Citations: 1
From the Co-Editors: There's Always Room for Improvement: Building Better Practices and Methods for a Brighter Future
American Journal of Evaluation · IF 1.7 · CAS Region 3 (Sociology)
Pub Date: 2023-03-01 · DOI: 10.1177/10982140231154683
Authors: J. Hall, Laura R. Peck
Abstract: Since becoming evaluators, we have observed how the field of evaluation has grown and changed. Major areas of development we have witnessed include increased attention to evaluation capacity-building initiatives; diversity, equity, and inclusion efforts; and demands for more adaptive evaluative strategies and techniques for improving the quality of evaluation planning and resulting evidence. Many of these areas of development in evaluation practice are in response to increased national and global complexity and uncertainty. Although the field has evolved in response to these challenges, we recognize that there is always room for improvement. We anticipate ongoing complexity and uncertainty as contemporary political, social, economic, and environmental shifts take place in our world. As such, we desire to push the field toward a more inclusive, adaptive, restorative, and effective evaluation praxis. This desire led us to assemble evaluation scholarship for this first issue of volume 44 in the form of five articles, a commentary, and a section on experimental methodology, including three articles. Separately, the articles in this issue extend the field of evaluation's development in the areas of evaluation capacity building (ECB), responsive and equity-oriented efforts, vigilant evaluation practice, and effective methodology. Collectively, the articles address the growing complexity of our world, providing insights and techniques to build better practices and methods for a brighter future. In the first article, Gregory Phillips II, Dylan Felt, Esrea Perez-Bill, Megan M. Ruprecht, Erik Elías Glenn, Peter Lindeman, and Robin Lin Miller propose an evaluation orientation that is responsive to the LGBTQ+ community's interests and needs. They abbreviate into LGBTQ+ individuals who identify as lesbian, gay, bisexual, transgender, queer, intersex, and Two-Spirit, inclusively along with other sexual and gender minorities; and they consider the intersectionality of these identity traits with those who are “also Black, Indigenous, and People of Color (BIPOC), those who are disabled, and those who are working-class, poor, and otherwise economically disadvantaged, among
Citations: 0