{"title":"Evaluating in a Fragmented Society","authors":"E. House","doi":"10.56645/jmde.v16i36.653","DOIUrl":"https://doi.org/10.56645/jmde.v16i36.653","url":null,"abstract":"Background: Over decades American society has become increasingly fragmented, distrusting, and unequal. Distrust and inequality interact with institutions performing improperly to weaken the society. \u0000Purpose: To suggest ways to strengthen evaluation’s role in a changing society \u0000Setting: Evaluation has entered a post normal phase where evaluations are losing credibility and effectiveness. \u0000Intervention: Analyze the changing society and suggest adjustments that evaluators might make. \u0000Research design: Collate and synthesize empirical studies about society and the implications for evaluators. \u0000Data collection and analysis: Collect and interpret seminal empirical economic, sociological, and political studies of beliefs and inequality in the United States. \u0000Findings: To strengthen the potency of evaluations of any type, evaluators could act as moral fiduciaries, practice transparency, cultivate cognitive empathy, focus on deep stories and deep values, and mitigate inequalities in the evaluation space. They can act as critics of evaluation practices inside and outside the evaluation space. They should avoid technical, social, and situational biases, including racism, sexism, and conflicts of interest, to increase the honesty and credibility of evaluations. They should not allow career concerns to prevent them from doing the right thing. These professional ethics and practices can be applied singly or collectively to most evaluation approaches to strengthen the evaluator’s role and address major societal problems. \u0000Keywords: moral fiduciary; cognitive empathy; post normal; inequality; transparency; distrust; deep stories; values","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41686152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating Public Participation in a Project Plan Review: A Nigerian Case Study","authors":"A. Badiora, A. Bako, D. O. Olaleye","doi":"10.56645/jmde.v16i36.633","DOIUrl":"https://doi.org/10.56645/jmde.v16i36.633","url":null,"abstract":"Background: Rooted in national and international laws regarding project planning and implementation is public participation. However, it is unclear whether public projects are enabling sufficient public input or are likely to be able to meet future management planning needs; particularly in developing countries. \u0000Purpose: We assessed people’s experiences when contributing to a public project decision-making in order to understand the strength, weaknesses, opportunities and threat to effective public participation. \u0000Setting: We conducted this assessment with a sample of people who contributed to a public project planning and review in a Nigerian city. \u0000Intervention: Not applicable. \u0000Research design: Appraisal criteria are based on the principles of public participation as laid down in the law and consists of the following elements: respondents’ profile, their involvement in the project; purpose of participation, availability of information, feedback mechanism and overall view of the participatory planning process. Information collected consists both quantitative and qualitative data and these were analysed using descriptive statistics and narrative techniques of reporting. \u0000Findings: Findings show that public participation was far below the minimum requirement of the law and not demographically representative. The most important reason respondents participated was to protect an interest in land, although some saw participation as a democratic right. Results show that attending public hearings was the commonest way of participation in a project review. Nevertheless, three-quarters of the respondents thought the final plan did not take their observations and advice into consideration. Respondents confirmed that the process was reasonably notified with opportunities for consultation meetings. Nevertheless, findings suggest some bias actions as significant proportions of respondents held absence of transparency and political interference flawed the project planning and review process. \u0000Keywords: stakeholder engagement; project evaluation; transparency; universal design; equality","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47012118","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Opportunities and Challenges to Increase Inter- and Transdisciplinarity: A Qualitative Study of the FloodRISE Project","authors":"J. Forbat","doi":"10.56645/jmde.v16i35.625","DOIUrl":"https://doi.org/10.56645/jmde.v16i35.625","url":null,"abstract":"Background: The FloodRISE project, which started in 2013 in Southern California, aimed at better understanding how to promote resilience to coastal flooding. It was based on a cross-disciplinary approach, involving several research teams and local communities. \u0000Purpose: We conducted a qualitative study of the first phase of the project (2013-2015) in order to analyze its inter- and transdisciplinary aspects. \u0000Setting: We conducted this evaluation as a visiting postdoctoral researcher at UCI, not participating in the FloodRISE project. \u0000Intervention: Not applicable. \u0000Research design: We conducted 18 semi-structured interviews with members of the three project teams - modeling, social ecology and integration & impact - at UCI in 2015. Data were analyzed and interpreted to identify key aspects of the collaboration within and between project teams, as well as their relationship to local stakeholders. \u0000Findings: The analysis showed that an intensive dialogue-based method of interaction and the presence of boundary researchers played a fundamental role in bridging the conceptual and methodological gaps between social and engineering sciences. These results thus exemplify several possibilities for developing more efficient interactions between researchers in a cross-disciplinary project. However, any cross-disciplinary project should: carefully evaluate potential for participants to become boundary researchers, since participants with multiple disciplinary expertise may be underemployed; improve researchers’ level of readiness, in order to facilitate further interaction and increase time efficiency; and clearly address remoteness issues to avoid lower collaboration between central and peripheral locations. \u0000Keywords: interdisciplinarity; transdisciplinarity; qualitative study; project evaluation; flood risk","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46260301","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a Complexity Framework for Transformative Evaluation","authors":"R. Picciotto","doi":"10.56645/jmde.v16i35.643","DOIUrl":"https://doi.org/10.56645/jmde.v16i35.643","url":null,"abstract":"Background: Complexity ideas originating in mathematics and the natural sciences have begun to inform evaluation practice. A new wave in evaluation history is about to break. A new mindset, new methods, and new evaluation processes are being summoned to explore and address the challenges of global pandemics, growing inequities, and existential environmental risks. This is part of a broader paradigm shift underway in science where interdisciplinarity has become the norm rather than the exception. \u0000Purpose: This article explores the utility of a complexity framework for a more effective evaluation function. It unearths the antecedents of complexity thinking; explores its relevance to evolving knowledge paradigms; provides a bird’s eye view of complexity concepts; uses the logic of complex adaptive systems to unpack the role of evaluation in society; and draws the implications of contemporary social challenges for evaluation policy directions. \u0000Setting: Not applicable. \u0000Intervention: Not applicable. \u0000Research design: Not applicable. \u0000Findings: The evaluation complexity challenge coincides with an urgent imperative: social transformation. The on-going pandemic has brought to light the disproportionate effects of health emergencies on disadvantaged groups and emphasized the urgency of improving the interface between humans and nature. It has also demonstrated the importance of modelling for policy making – as well as its limitations. Evaluation, a complex adaptive system, should be transformed to serve society. \u0000Keywords: complexity; computers; disciplines; emergence; modelling; paradigm, systems","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44873666","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Critical-Historical Review of Program Evaluation and the Emerging Motif ‘Evaluation Science’","authors":"W. J. Fear","doi":"10.56645/jmde.v16i35.627","DOIUrl":"https://doi.org/10.56645/jmde.v16i35.627","url":null,"abstract":"Background: It is important to distinguish between evaluation as an inherent, automatic, affective process and Program Evaluation (Evaluation, with capitalised ‘E’) as an institution, and equally important to consider what a good understanding of evaluation tells us about Evaluation. Evaluation is an established social institution whose modern roots can be traced back to 16th century France. Since the early 1900s the institution has developed within and across a range of scientific disciplines with interests in perceived social problems and efforts to resolve the said problems. This can be demonstrated objectively by the number and scale of relevant publications within relevant disciplines. This, in turn, helps us understand more about Evaluation as an institution. Set in this context is the question of Evaluation Science: is this simply a fashionable institutional motif or is it a potential new era for Evaluation? \u0000Purpose: Commentary on the history and development of Program Evaluation. \u0000Setting: Not applicable. \u0000Intervention: Not applicable. \u0000Research design: Not applicable. \u0000Data collection & analysis: Not applicable. \u0000Findings: Not applicable. \u0000Keywords: program evaluation; evaluation; history; evaluation science.","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49290572","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Birth and Adaptation of Evaluation Theories","authors":"Marvin C. Alkin, M. Patton","doi":"10.56645/jmde.v16i35.637","DOIUrl":"https://doi.org/10.56645/jmde.v16i35.637","url":null,"abstract":"Background: Evaluation theories as we know them are prescriptions by prominent evaluators about what they believe to be an appropriate way to conduct evaluations. How do these prescriptions come about? In this paper we examine the various influences on the creation and subsequent modification of these prescribed evaluation theories. Inquiry into evaluation theories has a long history. What is new is inquiry into the evolution of theories.This makes theory formulation dynamic rather than static. Influences identified by Alkin in a National Society for the Study of Education yearbook (1989) serve as an initial guide to this inquiry. An examination of Michael Q. Patton's writings and shaping experiences provides further case study insights about the evolution of his utilization-focused evaluation theory and its offshoots. \u0000Purpose: The purpose of this paper is to gain further understanding about the way in which evaluation theories are developed, evolve, and take new directions, and the influences that shape the theorists' understandings and prescriptions. \u0000Setting: Interview discussion with Michael Q. Patton and synthesis of interview data. \u0000Intervention: Not applicable. \u0000Research design: Not applicable. \u0000Data collection & analysis: Not applicable. \u0000Findings: Factors that have influenced Michael Q. Patton’s initial theory development as well as subsequent modifications, adaptations, and offshoots offer insights into the connection between personal history and professional perspective. Specifically, these factors were: early personal experiences, professional training, interaction with professional colleagues, field evaluation experiences, interaction with non-evaluation academic colleagues and research conducted by Patton. \u0000Keywords: evaluation theory; theory; utilization-focused evaluation; developmental evaluation; principles-focused evaluation.","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43571478","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Process Evaluation to Document Crucial Moments in Development of the National Neurological Conditions Surveillance System at the U.S. Centers for Disease Control and Prevention","authors":"Goldie MacDonald, S. Mercer, Angela H Fisher, V. Grigorescu, W. M. Kenzie, M. Anderson","doi":"10.56645/jmde.v16i35.635","DOIUrl":"https://doi.org/10.56645/jmde.v16i35.635","url":null,"abstract":"Background: Neurological conditions or disorders strike roughly 50 million Americans annually but accurate and comprehensive national estimates for many of these conditions are not available. In 2019, Congress provided $5 million to Centers for Disease Control and Prevention (CDC) to establish the National Neurological Conditions Surveillance System (NNCSS). CDC focused initial activities on multiple sclerosis and Parkinson’s disease. \u0000Purpose: We conducted a process evaluation to document and understand multifaceted work to implement a new surveillance activity for two neurological conditions. \u0000Setting: We conducted this evaluation with government personnel internal to the Center for Surveillance, Epidemiology, and Laboratory Services at the Centers for Disease Control and Prevention in Atlanta, GA. \u0000Intervention: A new public health surveillance activity for two neurological conditions, multiple sclerosis and Parkinson’s disease, that uses existing data resources and systems. \u0000Research design: The evaluation included interviews with CDC personnel and review of administrative and programmatic information. Data were analyzed and interpreted to identify crucial moments in the first year of funded work on NNCSS. The study revealed that this surveillance activity required diverse contributions and collaboration within the federal government and with non-governmental organizations. The findings can be used to guide work to enhance surveillance for many neurological conditions. \u0000Findings: The study revealed that this surveillance activity required diverse contributions and collaboration within the federal government and with non-governmental organizations. While collaboration is a cornerstone of public health practice, it is not always well-documented in planning or implementation of surveillance or other data-related activities. \u0000Keywords: program evaluation; surveillance; neurological conditions; neurological disorders; multiple sclerosis; Parkinson’s disease.","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43775446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Refining and Measuring the Construct of Evaluative Thinking: An Exploratory Factor Analysis of the Evaluative Thinking Inventory","authors":"Jason S. McIntosh, Jane Buckley, T. Archibald","doi":"10.56645/jmde.v16i34.591","DOIUrl":"https://doi.org/10.56645/jmde.v16i34.591","url":null,"abstract":"Background: Evaluative thinking has emerged as a key construct in evaluation, especially for evaluation practitioners and researchers interested in evaluation capacity building (ECB). Yet, despite increasing calls for more research on evaluation and, more specifically, for more research on ECB, until recently little empirical inquiry on the dimensions of evaluative thinking has been conducted. \u0000Purpose: To address that lack, the purpose of the study presented in this paper is to refine the construct of evaluative thinking by exploring its underlying dimensions and to ascertain the internal consistency of an instrument developed to measure evaluative thinking, the Evaluative Thinking Inventory (ETI). \u0000Setting: The ETI was developed as part of an ECB initiative focused on non-formal science, engineering, technology, and math (STEM) education in the United States, and was tested as part of a study focused on evaluating gifted education programs, also in the United States. \u0000Intervention: Not applicable. \u0000Research design: Survey research and exploratory factor analysis (EFA). \u0000Data collection & analysis: The ETI was administered to participants in a study measuring the effectiveness of a tool used to conduct internal evaluations of gifted education programs. SPSS was used to conduct an EFA on 96 completed ETIs. Cronbach’s alpha was used to estimate the internal consistency of the instrument. \u0000Findings: The analysis of the ETI revealed a two-factor model of evaluative thinking (i.e., believe in and practice evaluation and pose thoughtful questions and seek alternatives). This study also provided internal consistency evidence for the ETI showing alpha reliabilities for the two factors ranging from 0.80 to 0.82. The ETI has potentially wide applicability in research and practice in ECB and in the field of evaluation more generally. \u0000Keywords: evaluative thinking; evaluation capacity building; research on evaluation; exploratory factor analysis.","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46038831","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using PhotoVoice as an Evaluation Method","authors":"O. Hunter, Emma Leeburg, Michael A. Harnar","doi":"10.56645/jmde.v16i34.603","DOIUrl":"https://doi.org/10.56645/jmde.v16i34.603","url":null,"abstract":"Background: Engaging with youth through PhotoVoice is beneficial as a program evaluation method and functions as a method of inquiry to understand youths’ perceptions of a college preparation program. The students used PhotoVoice to respond to prompts about how they learn and their opinions of the college preparation program. \u0000Purpose: This reflection of practice article provides an example of PhotoVoice as an evaluation method. \u0000Setting: This evaluation was conducted during the summer college preparation programming. \u0000Intervention: The combination of student photography, photo gallery walk, and group discussion as an evaluation method. \u0000Research Design: A qualitative reflective design. \u0000Data Collection and Analysis: The student photographs, narratives, and observational notes were analyzed thematically. \u0000Findings: We found that the youths enjoyed the unique experience of PhotoVoice. The combination of student photography, narratives, gallery walk, and group discussion, were useful as an inquiry tool for this youth program.","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47005439","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adopting Tools from Cost and Management Accounting to Improve the Manner in Which Costs in Social Programs are Analyzed and Evaluated","authors":"Nadini Persaud","doi":"10.56645/jmde.v16i34.615","DOIUrl":"https://doi.org/10.56645/jmde.v16i34.615","url":null,"abstract":"\u0000 \u0000 \u0000 \u0000Background: Managing programs in an environment where financial resources are limited, budget cuts are a reality, and external funding is now fiercely competitive, necessitate that both program administrators and program evaluators have a better understanding of program costs, so that financial resources can be optimized for societal good. This requires serious analysis of cost behavior and a proper understanding of the relationship between a program's variable costs and fixed costs since these costs have implications for clients fees and the number of clients that can be served. These types of analyses are quite routine in the profitability sector, but are considerably underutilized in other sectors. \u0000Purpose: This paper will explain how several common strategic management tools from cost and management accounting can be used to present more meaningful and useful cost information, so that social program decision-making and cost-inclusive evaluations can be enhanced. \u0000Setting: N/A. \u0000 \u0000 \u0000Intervention: N/A. \u0000Research Design: A desk review was utilized for the discussion of the cost and management accounting concepts and tools outlined in this paper. The paper illustrates how the toolkit of economic evaluation tools can be enhanced by adding tools from cost and management accounting to enhance strategic decision-making. \u0000Findings: This paper concludes by noting that program sustainability must be the new name of the game. This necessitates that program administrators and program evaluators start to analyze and evaluate program costs differently. Much work is needed to move towards a different philosophy of thinking with regards to program costs. Program administrators and program evaluators must therefore rise to the challenge and embrace cost analytical methodologies from other disciplines since the use of such methodologies can be beneficial to all concerned. \u0000 \u0000 \u0000 \u0000 \u0000 ","PeriodicalId":91909,"journal":{"name":"Journal of multidisciplinary evaluation","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44952499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}