{"title":"Interpreting the magnitude of predictor effect sizes: It is time for more sensible benchmarks","authors":"Scott Highhouse, Margaret E. Brooks","doi":"10.1017/iop.2023.30","DOIUrl":null,"url":null,"abstract":"Sackett et al. (2021) published a disruptive piece that is summarized in Sackett et al. (2023) focal article. As the authors explain in the focal article, not only did their 2021 paper show that rangerestriction overcorrection has led to inflated estimates of validity for selection devices, but that the new corrections actually alter the rank ordering of predictors established in Schmidt and Hunter (1998). Many are celebrating that structured interviews have supplanted general mental ability for the top validity spot. Others, however, were deflated by the generally shrunken effect sizes associated with the new corrections. According to Sackett et al. (2023), many practitioners feel that these revised estimates do not help our traditional predictors compete for success in the marketplace, leaving many wondering how to effectively communicate the relative efficacy of predictors to key stakeholders. We believe that many scientists and practitioners hold unrealistic standards of success. It is time, therefore, for I–O psychologists to adopt and communicate new benchmarks for evaluating predictor effect sizes.","PeriodicalId":47771,"journal":{"name":"Industrial and Organizational Psychology-Perspectives on Science and Practice","volume":"16 1","pages":"332 - 335"},"PeriodicalIF":11.5000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Industrial and Organizational Psychology-Perspectives on Science and Practice","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1017/iop.2023.30","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, APPLIED","Score":null,"Total":0}
Citations: 1
Abstract
Sackett et al. (2021) published a disruptive piece that is summarized in the Sackett et al. (2023) focal article. As the authors explain in the focal article, their 2021 paper showed not only that range-restriction overcorrection has led to inflated estimates of validity for selection devices, but also that the new corrections actually alter the rank ordering of predictors established in Schmidt and Hunter (1998). Many are celebrating that structured interviews have supplanted general mental ability for the top validity spot. Others, however, were deflated by the generally shrunken effect sizes associated with the new corrections. According to Sackett et al. (2023), many practitioners feel that these revised estimates do not help our traditional predictors compete for success in the marketplace, leaving many wondering how to effectively communicate the relative efficacy of predictors to key stakeholders. We believe that many scientists and practitioners hold unrealistic standards of success. It is time, therefore, for I–O psychologists to adopt and communicate new benchmarks for evaluating predictor effect sizes.
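The abstract's point about overcorrection is easiest to see with a range-restriction correction formula itself. The sketch below assumes the standard Thorndike Case II correction for direct range restriction as a generic illustration of how such corrections work; it is not the specific artifact-correction procedure used by Sackett et al., and the observed validity and restriction ratios are made-up illustrative values, not figures from the article.

```python
import math

def correct_for_range_restriction(r_obs: float, u: float) -> float:
    """Thorndike Case II correction for direct range restriction.

    r_obs: validity observed in the restricted (selected) sample.
    u:     ratio of unrestricted to restricted predictor SD (u >= 1).
    """
    return (r_obs * u) / math.sqrt(1 + r_obs ** 2 * (u ** 2 - 1))

# Illustrative values only (not taken from Sackett et al., 2021):
r_obs = 0.25
for u in (1.0, 1.3, 1.8):  # larger u = more severe assumed restriction
    r_corrected = correct_for_range_restriction(r_obs, u)
    print(f"u = {u:.1f} -> corrected validity = {r_corrected:.2f}")
```

The same observed correlation yields a substantially larger "corrected" validity when a more severe restriction ratio is assumed, which is why overly aggressive range-restriction assumptions can inflate the validity estimates reported for selection devices.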
Journal introduction:
Industrial and Organizational Psychology-Perspectives on Science and Practice is a peer-reviewed academic journal published on behalf of the Society for Industrial and Organizational Psychology. The journal focuses on interactive exchanges on topics of importance to the science and practice of the field. It features articles that present new ideas or different takes on existing ideas, stimulating dialogue about important issues in the field. Additionally, the journal is indexed and abstracted in Clarivate Analytics SSCI, Clarivate Analytics Web of Science, European Reference Index for the Humanities and Social Sciences (ERIH PLUS), ProQuest, PsycINFO, and Scopus.