Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations

IF 2.9 · CAS Zone 4 (Management) · JCR Q1 INFORMATION SCIENCE & LIBRARY SCIENCE
João M. Santos
{"title":"为什么是jusicabit ipsos jusices ?关于竞争性供资小组评价动态的个案研究","authors":"João M. Santos","doi":"10.1093/reseval/rvac021","DOIUrl":null,"url":null,"abstract":"\n Securing research funding is essential for all researchers. The standard evaluation method for competitive grants is through evaluation by a panel of experts. However, the literature notes that peer review has inherent flaws and is subject to biases, which can arise from differing interpretations of the criteria, the impossibility for a group of reviewers to be experts in all possible topics within their field, and the role of affect. As such, understanding the dynamics at play during panel evaluations is crucial to allow researchers a better chance at securing funding, and also for the reviewers themselves to be aware of the cognitive mechanisms underlying their decision-making. In this study, we conduct a case study based on application and evaluation data for two social sciences panels in a competitive state-funded call in Portugal. Using a mixed-methods approach, we find that qualitative evaluations largely resonate with the evaluation criteria, and the candidate’s scientific output is partially aligned with the qualitative evaluations, but scientometric indicators alone do not significantly influence the candidate’s evaluation. However, the polarity of the qualitative evaluation has a positive influence on the candidate’s evaluation. This paradox is discussed as possibly resulting from the occurrence of a halo effect in the panel’s judgment of the candidates. By providing a multi-methods approach, this study aims to provide insights that can be useful for all stakeholders involved in competitive funding evaluations.","PeriodicalId":47668,"journal":{"name":"Research Evaluation","volume":" ","pages":""},"PeriodicalIF":2.9000,"publicationDate":"2022-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations\",\"authors\":\"João M. Santos\",\"doi\":\"10.1093/reseval/rvac021\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n Securing research funding is essential for all researchers. The standard evaluation method for competitive grants is through evaluation by a panel of experts. However, the literature notes that peer review has inherent flaws and is subject to biases, which can arise from differing interpretations of the criteria, the impossibility for a group of reviewers to be experts in all possible topics within their field, and the role of affect. As such, understanding the dynamics at play during panel evaluations is crucial to allow researchers a better chance at securing funding, and also for the reviewers themselves to be aware of the cognitive mechanisms underlying their decision-making. In this study, we conduct a case study based on application and evaluation data for two social sciences panels in a competitive state-funded call in Portugal. Using a mixed-methods approach, we find that qualitative evaluations largely resonate with the evaluation criteria, and the candidate’s scientific output is partially aligned with the qualitative evaluations, but scientometric indicators alone do not significantly influence the candidate’s evaluation. However, the polarity of the qualitative evaluation has a positive influence on the candidate’s evaluation. 
This paradox is discussed as possibly resulting from the occurrence of a halo effect in the panel’s judgment of the candidates. By providing a multi-methods approach, this study aims to provide insights that can be useful for all stakeholders involved in competitive funding evaluations.\",\"PeriodicalId\":47668,\"journal\":{\"name\":\"Research Evaluation\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2022-07-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Research Evaluation\",\"FirstCategoryId\":\"91\",\"ListUrlMain\":\"https://doi.org/10.1093/reseval/rvac021\",\"RegionNum\":4,\"RegionCategory\":\"管理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"INFORMATION SCIENCE & LIBRARY SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Evaluation","FirstCategoryId":"91","ListUrlMain":"https://doi.org/10.1093/reseval/rvac021","RegionNum":4,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Citations: 0

Abstract

Securing research funding is essential for all researchers. The standard evaluation method for competitive grants is through evaluation by a panel of experts. However, the literature notes that peer review has inherent flaws and is subject to biases, which can arise from differing interpretations of the criteria, the impossibility for a group of reviewers to be experts in all possible topics within their field, and the role of affect. As such, understanding the dynamics at play during panel evaluations is crucial to allow researchers a better chance at securing funding, and also for the reviewers themselves to be aware of the cognitive mechanisms underlying their decision-making. In this study, we conduct a case study based on application and evaluation data for two social sciences panels in a competitive state-funded call in Portugal. Using a mixed-methods approach, we find that qualitative evaluations largely resonate with the evaluation criteria, and the candidate’s scientific output is partially aligned with the qualitative evaluations, but scientometric indicators alone do not significantly influence the candidate’s evaluation. However, the polarity of the qualitative evaluation has a positive influence on the candidate’s evaluation. This paradox is discussed as possibly resulting from the occurrence of a halo effect in the panel’s judgment of the candidates. By providing a multi-methods approach, this study aims to provide insights that can be useful for all stakeholders involved in competitive funding evaluations.
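The abstract does not specify how the polarity of the qualitative evaluations was measured or related to scores. Purely as an illustration, the minimal Python sketch below shows one common way such an analysis could be set up: scoring the sentiment polarity of each panel comment and testing its association with the candidate's final score. The `comments` and `final_scores` data, and the use of TextBlob and SciPy, are assumptions made for this sketch, not the study's actual method.

```python
# Illustrative sketch only: sentiment polarity of panel comments vs. final scores.
# Hypothetical data and tooling (TextBlob, SciPy); not the method used in the paper.
from textblob import TextBlob
from scipy.stats import pearsonr

# Hypothetical qualitative evaluations and panel scores for a few candidates.
comments = [
    "Outstanding publication record and a highly coherent research plan.",
    "Solid output, but the proposed project lacks methodological detail.",
    "Limited international visibility; few recent publications in the CV.",
]
final_scores = [9.2, 7.5, 5.8]

# Polarity in [-1, 1]: negative to positive tone of each written evaluation.
polarities = [TextBlob(text).sentiment.polarity for text in comments]

# Simple association test between comment polarity and the awarded score.
r, p_value = pearsonr(polarities, final_scores)
print(f"polarities = {polarities}")
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```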
Source journal
Research Evaluation
CiteScore: 6.00
Self-citation rate: 18.20%
Articles published: 42
Journal description: Research Evaluation is a peer-reviewed, international journal. Its scope ranges from the individual research project up to inter-country comparisons of research performance. Research projects, researchers, research centres, and all types of research output are relevant. It covers the public and private sectors, and the natural and social sciences. The term "evaluation" applies to all stages, from priorities and proposals, through the monitoring of on-going projects and programmes, to the use of the results of research.