Does reviewing experience reduce disagreement in proposals evaluation? Insights from Marie Skłodowska-Curie and COST Actions

IF 2.9 | CAS Tier 4, Management | JCR Q1, Information Science & Library Science
M. Seeber, Jef Vlegels, Elwin Reimink, A. Marušić, David G. Pina
{"title":"评审经验是否能减少提案评估中的分歧?来自Marie Skłodowska-Curie和COST Actions的见解","authors":"M. Seeber, Jef Vlegels, Elwin Reimink, A. Marušić, David G. Pina","doi":"10.1093/RESEVAL/RVAB011","DOIUrl":null,"url":null,"abstract":"\n We have limited understanding of why reviewers tend to strongly disagree when scoring the same research proposal. Thus far, research that explored disagreement has focused on the characteristics of the proposal or the applicants, while ignoring the characteristics of the reviewers themselves. This article aims to address this gap by exploring which reviewer characteristics most affect disagreement among reviewers. We present hypotheses regarding the effect of a reviewer’s level of experience in evaluating research proposals for a specific granting scheme, that is, scheme reviewing experience. We test our hypotheses by studying two of the most important research funding programmes in the European Union from 2014 to 2018, namely, 52,488 proposals evaluated under three funding schemes of the Horizon 2020 Marie Sklodowska-Curie Actions (MSCA), and 1,939 proposals evaluated under the European Cooperation in Science and Technology Actions. We find that reviewing experience on previous calls of a specific scheme significantly reduces disagreement, while experience of evaluating proposals in other schemes—namely, general reviewing experience, does not have any effect. Moreover, in MSCA—Individual Fellowships, we observe an inverted U relationship between the number of proposals a reviewer evaluates in a given call and disagreement, with a remarkable decrease in disagreement above 13 evaluated proposals. Our results indicate that reviewing experience in a specific scheme improves reliability, curbing unwarranted disagreement by fine-tuning reviewers’ evaluation.","PeriodicalId":47668,"journal":{"name":"Research Evaluation","volume":" ","pages":""},"PeriodicalIF":2.9000,"publicationDate":"2021-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/RESEVAL/RVAB011","citationCount":"11","resultStr":"{\"title\":\"Does reviewing experience reduce disagreement in proposals evaluation? Insights from Marie Skłodowska-Curie and COST Actions\",\"authors\":\"M. Seeber, Jef Vlegels, Elwin Reimink, A. Marušić, David G. Pina\",\"doi\":\"10.1093/RESEVAL/RVAB011\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n We have limited understanding of why reviewers tend to strongly disagree when scoring the same research proposal. Thus far, research that explored disagreement has focused on the characteristics of the proposal or the applicants, while ignoring the characteristics of the reviewers themselves. This article aims to address this gap by exploring which reviewer characteristics most affect disagreement among reviewers. We present hypotheses regarding the effect of a reviewer’s level of experience in evaluating research proposals for a specific granting scheme, that is, scheme reviewing experience. We test our hypotheses by studying two of the most important research funding programmes in the European Union from 2014 to 2018, namely, 52,488 proposals evaluated under three funding schemes of the Horizon 2020 Marie Sklodowska-Curie Actions (MSCA), and 1,939 proposals evaluated under the European Cooperation in Science and Technology Actions. 
We find that reviewing experience on previous calls of a specific scheme significantly reduces disagreement, while experience of evaluating proposals in other schemes—namely, general reviewing experience, does not have any effect. Moreover, in MSCA—Individual Fellowships, we observe an inverted U relationship between the number of proposals a reviewer evaluates in a given call and disagreement, with a remarkable decrease in disagreement above 13 evaluated proposals. Our results indicate that reviewing experience in a specific scheme improves reliability, curbing unwarranted disagreement by fine-tuning reviewers’ evaluation.\",\"PeriodicalId\":47668,\"journal\":{\"name\":\"Research Evaluation\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2021-04-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1093/RESEVAL/RVAB011\",\"citationCount\":\"11\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Research Evaluation\",\"FirstCategoryId\":\"91\",\"ListUrlMain\":\"https://doi.org/10.1093/RESEVAL/RVAB011\",\"RegionNum\":4,\"RegionCategory\":\"管理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"INFORMATION SCIENCE & LIBRARY SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Evaluation","FirstCategoryId":"91","ListUrlMain":"https://doi.org/10.1093/RESEVAL/RVAB011","RegionNum":4,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Citations: 11

Abstract

We have limited understanding of why reviewers tend to strongly disagree when scoring the same research proposal. Thus far, research that explored disagreement has focused on the characteristics of the proposal or the applicants, while ignoring the characteristics of the reviewers themselves. This article aims to address this gap by exploring which reviewer characteristics most affect disagreement among reviewers. We present hypotheses regarding the effect of a reviewer’s level of experience in evaluating research proposals for a specific granting scheme, that is, scheme reviewing experience. We test our hypotheses by studying two of the most important research funding programmes in the European Union from 2014 to 2018, namely, 52,488 proposals evaluated under three funding schemes of the Horizon 2020 Marie Sklodowska-Curie Actions (MSCA), and 1,939 proposals evaluated under the European Cooperation in Science and Technology Actions. We find that reviewing experience on previous calls of a specific scheme significantly reduces disagreement, while experience of evaluating proposals in other schemes—namely, general reviewing experience, does not have any effect. Moreover, in MSCA—Individual Fellowships, we observe an inverted U relationship between the number of proposals a reviewer evaluates in a given call and disagreement, with a remarkable decrease in disagreement above 13 evaluated proposals. Our results indicate that reviewing experience in a specific scheme improves reliability, curbing unwarranted disagreement by fine-tuning reviewers’ evaluation.
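The abstract does not spell out how reviewer disagreement is measured or how the inverted-U pattern is detected. The sketch below is a minimal, hypothetical Python illustration of one common approach, assuming disagreement is quantified as the standard deviation of the scores a proposal receives and that the inverted U is tested by fitting a quadratic term on a reviewer-workload variable; the data, variable names, and coefficients are synthetic and are not taken from the paper.

```python
# Illustrative sketch (not the paper's actual method): measure per-proposal
# disagreement as the standard deviation of its reviewers' scores, then fit
# disagreement ~ b0 + b1*workload + b2*workload^2 to check for an inverted U.
# All data below are synthetic; the real analysis would use per-reviewer
# workloads and the programme's own scoring scale.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: one workload value per proposal (a simplification) and
# three reviewer scores per proposal, with disagreement built to follow
# an inverted-U shape in the workload variable.
n_proposals = 500
workload = rng.integers(1, 30, size=n_proposals)              # proposals handled in the call
true_disagreement = 1.0 + 0.10 * workload - 0.003 * workload**2  # stays positive on [1, 29]
scores = rng.normal(loc=70, scale=true_disagreement[:, None], size=(n_proposals, 3))

# Disagreement measure: per-proposal standard deviation of reviewer scores.
disagreement = scores.std(axis=1, ddof=1)

# Quadratic fit via least squares.
X = np.column_stack([np.ones(n_proposals), workload, workload**2])
(b0, b1, b2), *_ = np.linalg.lstsq(X, disagreement, rcond=None)

print(f"fit: disagreement = {b0:.3f} + {b1:.3f}*w + {b2:.4f}*w^2")
if b2 < 0:
    # A negative quadratic coefficient indicates an inverted U; the turning
    # point is where the fitted disagreement starts to decline.
    print(f"inverted U; fitted peak near w = {-b1 / (2 * b2):.1f} evaluated proposals")
```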
Source journal
Research Evaluation
CiteScore: 6.00
Self-citation rate: 18.20%
Articles per year: 42
Journal description: Research Evaluation is a peer-reviewed, international journal. Its scope ranges from individual research projects up to inter-country comparisons of research performance. Research projects, researchers, research centres, and all types of research output are relevant. It covers the public and private sectors, and the natural and social sciences. The term "evaluation" applies to all stages, from priorities and proposals, through the monitoring of ongoing projects and programmes, to the use of the results of research.