M. Seeber, Jef Vlegels, Elwin Reimink, A. Marušić, David G. Pina
DOI: 10.1093/RESEVAL/RVAB011
Journal: Research Evaluation (JCR Q1, Information Science & Library Science; Impact Factor 2.9)
Published: 2021-04-28 (Journal Article)
Citation count: 11
Does reviewing experience reduce disagreement in proposals evaluation? Insights from Marie Skłodowska-Curie and COST Actions
We have limited understanding of why reviewers tend to strongly disagree when scoring the same research proposal. Thus far, research exploring disagreement has focused on the characteristics of the proposal or the applicants, while ignoring the characteristics of the reviewers themselves. This article addresses this gap by exploring which reviewer characteristics most affect disagreement among reviewers. We present hypotheses regarding the effect of a reviewer's level of experience in evaluating research proposals for a specific granting scheme, that is, scheme reviewing experience. We test our hypotheses by studying two of the most important research funding programmes in the European Union from 2014 to 2018: 52,488 proposals evaluated under three funding schemes of the Horizon 2020 Marie Skłodowska-Curie Actions (MSCA), and 1,939 proposals evaluated under the European Cooperation in Science and Technology (COST) Actions. We find that reviewing experience on previous calls of a specific scheme significantly reduces disagreement, while experience of evaluating proposals in other schemes, that is, general reviewing experience, does not have any effect. Moreover, in MSCA Individual Fellowships, we observe an inverted-U relationship between the number of proposals a reviewer evaluates in a given call and disagreement, with a marked decrease in disagreement above 13 evaluated proposals. Our results indicate that reviewing experience in a specific scheme improves reliability, curbing unwarranted disagreement by fine-tuning reviewers' evaluations.
Journal description:
Research Evaluation is a peer-reviewed, international journal. Its scope ranges from the individual research project up to inter-country comparisons of research performance. Research projects, researchers, research centres, and all types of research output are relevant, spanning the public and private sectors and the natural and social sciences. The term "evaluation" applies to all stages, from priorities and proposals, through the monitoring of ongoing projects and programmes, to the use of research results.