Meta-analytical estimates of interrater reliability for direct supervisor performance ratings: Optimism under optimal measurement designs.

IF 9.4 · CAS Tier 1 (Psychology) · Q1 MANAGEMENT
Journal of Applied Psychology · Pub Date: 2024-03-01 · Epub Date: 2023-10-12 · DOI: 10.1037/apl0001146
Andrew B Speer, Angie Y Delacruz, Lauren J Wegmeyer, James Perrotta
Citations: 0

Abstract

Performance appraisal (PA) is used for various organizational purposes and is vital to human resources practices. Despite this, current estimates of PA reliability are low, leading to decades of criticism regarding the use of PA in organizational contexts. In this article, we argue that current meta-analytical interrater reliability (IRR) coefficients are underestimates and do not reflect the reliability of interest to most practitioners and researchers: the reliability of ratings made by an employee's direct supervisor. To establish the reliability of direct supervisor ratings, those making PA ratings must actually supervise the employee's job performance, rather than relying on nonparallel rater designs (e.g., direct supervisor ratings correlated with ratings from a more senior leader). The current meta-analysis identified 22 independent samples that met this more restrictive inclusion criterion, finding an average observed IRR of .65. We also report reliability estimates for several important contextual moderators, including whether ratings were completed in operational settings (.60) or for research purposes (.67). In sum, we argue that this study's meta-analytical IRR estimates are the best available estimates of direct supervisor reliability and should be used to guide future research and practice. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
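In practice, the main use of a meta-analytic IRR estimate like the .65 reported here is to correct observed validity coefficients for criterion unreliability via the classical attenuation formula, r_corrected = r_observed / sqrt(rxx · ryy). A minimal sketch of that correction is below; the function name and the illustrative observed correlation of .30 are assumptions for the example, not values from the article.

```python
import math

def disattenuate(r_observed: float, ryy: float, rxx: float = 1.0) -> float:
    """Correct an observed correlation for unreliability using the
    classical attenuation formula: r_true = r_obs / sqrt(rxx * ryy).

    ryy: reliability of the criterion (e.g., supervisor-rating IRR).
    rxx: reliability of the predictor (defaults to 1.0, i.e., no
         predictor correction).
    """
    return r_observed / math.sqrt(rxx * ryy)

# Example: a hypothetical observed predictor-performance correlation
# of .30, corrected with the meta-analytic direct-supervisor IRR of .65.
print(round(disattenuate(0.30, ryy=0.65), 3))  # 0.372
```

Because .65 is larger than earlier meta-analytic IRR estimates for supervisor ratings, corrections based on it are more conservative: the same observed correlation is corrected upward by less.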

Source journal: Journal of Applied Psychology · CiteScore: 17.60 · Self-citation rate: 6.10% · Articles published: 175
About the journal: The Journal of Applied Psychology® focuses on publishing original investigations that contribute new knowledge and understanding to fields of applied psychology (excluding clinical and applied experimental or human factors, which are better suited for other APA journals). The journal primarily considers empirical and theoretical investigations that enhance understanding of cognitive, motivational, affective, and behavioral psychological phenomena in work and organizational settings. These phenomena can occur at individual, group, organizational, or cultural levels, and in various work settings such as business, education, training, health, service, government, or military institutions. The journal welcomes submissions from both public and private sector organizations, for-profit or nonprofit. It publishes several types of articles, including: 1. Rigorously conducted empirical investigations that expand conceptual understanding (original investigations or meta-analyses). 2. Theory development articles and integrative conceptual reviews that synthesize literature and generate new theories on psychological phenomena to stimulate novel research. 3. Rigorously conducted qualitative research on phenomena that are challenging to capture with quantitative methods or require inductive theory building.