{"title":"重新审视预测-标准-结构一致性:对人事选择系统设计的启示","authors":"L. Hough, F. Oswald","doi":"10.1017/iop.2023.35","DOIUrl":null,"url":null,"abstract":"Overview In their focal article, Sackett et al. (in press) describe implications of their new meta-analytic estimates of validity of widely used predictors for selection of employees. Contradicting the received wisdom of Schmidt and Hunter (1998), Sackett et al. conclude that predictor methods with content specifically tailored to jobs generally have greater validity for predicting job performance than general measures reflecting psychological constructs (e.g., cognitive abilities, personality traits). They also point out that standard deviations around the mean of their metaanalytic validity estimates are often large, leading to their question “why the variability?” (p. x). They suggest many legitimate contributors. We propose an additional moderator variable of critical importance: predictor-criterion construct congruence, accounting for a great deal of variability in validity coefficients found in meta-analysis. That is, the extent to which what is measured is congruent with what is predicted is an important determinant of the level of validity obtained. Sackett et al. (2022) acknowledge that the strongest predictors in their re-analysis are job-specific measures and that a “closer behavioral match between predictor and criterion” (p. 2062) might contribute to higher validities. Many in our field have also noted the importance of “behavioral consistency” between predictors and criteria relevant to selection, while also arguing for another type of congruence: the relationships between constructs in both the predictor and criterion space (e.g., Bartram, 2005; Campbell et al., 1993; Campbell & Knapp, 2001; Hogan & Holland, 2003; Hough, 1992; Hough & Oswald, 2005; Pulakos et al., 1988; Sackett & Lievens, 2008; Schmitt & Ostroff, 1986). 
The above reflects an important distinction between two types of congruence: behavior-based congruence and construct-based congruence. When ‘past behavior predicts future behavior’ (as might be possible for jobs requiring past experience and where behavior-oriented employment assessments such as interviews, biodata, and work samples are involved), behavior-based congruence exists. Behavior-based assessments can vary a great deal across jobs but tend to ask about past experiences that are influenced by a complex mix of KSAOs. By contrast, constructbased congruence aligns employment tests of job-relevant KSAOs (e.g., verbal and math skills, conscientiousness) with relevant work criteria, such as technical performance or counterproductive work behavior (e.g., Campbell & Wiernik, 2015). What we are suggesting strongly here is that regardless of the approach to congruence adopted in selection, it is the congruence between predictor and criterion constructs that is a key factor","PeriodicalId":47771,"journal":{"name":"Industrial and Organizational Psychology-Perspectives on Science and Practice","volume":"16 1","pages":"307 - 312"},"PeriodicalIF":11.5000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Revisiting predictor–criterion construct congruence: Implications for designing personnel selection systems\",\"authors\":\"L. Hough, F. Oswald\",\"doi\":\"10.1017/iop.2023.35\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Overview In their focal article, Sackett et al. (in press) describe implications of their new meta-analytic estimates of validity of widely used predictors for selection of employees. Contradicting the received wisdom of Schmidt and Hunter (1998), Sackett et al. 
conclude that predictor methods with content specifically tailored to jobs generally have greater validity for predicting job performance than general measures reflecting psychological constructs (e.g., cognitive abilities, personality traits). They also point out that standard deviations around the mean of their metaanalytic validity estimates are often large, leading to their question “why the variability?” (p. x). They suggest many legitimate contributors. We propose an additional moderator variable of critical importance: predictor-criterion construct congruence, accounting for a great deal of variability in validity coefficients found in meta-analysis. That is, the extent to which what is measured is congruent with what is predicted is an important determinant of the level of validity obtained. Sackett et al. (2022) acknowledge that the strongest predictors in their re-analysis are job-specific measures and that a “closer behavioral match between predictor and criterion” (p. 2062) might contribute to higher validities. Many in our field have also noted the importance of “behavioral consistency” between predictors and criteria relevant to selection, while also arguing for another type of congruence: the relationships between constructs in both the predictor and criterion space (e.g., Bartram, 2005; Campbell et al., 1993; Campbell & Knapp, 2001; Hogan & Holland, 2003; Hough, 1992; Hough & Oswald, 2005; Pulakos et al., 1988; Sackett & Lievens, 2008; Schmitt & Ostroff, 1986). The above reflects an important distinction between two types of congruence: behavior-based congruence and construct-based congruence. When ‘past behavior predicts future behavior’ (as might be possible for jobs requiring past experience and where behavior-oriented employment assessments such as interviews, biodata, and work samples are involved), behavior-based congruence exists. 
Behavior-based assessments can vary a great deal across jobs but tend to ask about past experiences that are influenced by a complex mix of KSAOs. By contrast, constructbased congruence aligns employment tests of job-relevant KSAOs (e.g., verbal and math skills, conscientiousness) with relevant work criteria, such as technical performance or counterproductive work behavior (e.g., Campbell & Wiernik, 2015). What we are suggesting strongly here is that regardless of the approach to congruence adopted in selection, it is the congruence between predictor and criterion constructs that is a key factor\",\"PeriodicalId\":47771,\"journal\":{\"name\":\"Industrial and Organizational Psychology-Perspectives on Science and Practice\",\"volume\":\"16 1\",\"pages\":\"307 - 312\"},\"PeriodicalIF\":11.5000,\"publicationDate\":\"2023-08-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Industrial and Organizational Psychology-Perspectives on Science and Practice\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1017/iop.2023.35\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Industrial and Organizational Psychology-Perspectives on Science and Practice","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1017/iop.2023.35","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, APPLIED","Score":null,"Total":0}
Revisiting predictor–criterion construct congruence: Implications for designing personnel selection systems
Overview

In their focal article, Sackett et al. (in press) describe the implications of their new meta-analytic estimates of the validity of widely used predictors for employee selection. Contradicting the received wisdom of Schmidt and Hunter (1998), Sackett et al. conclude that predictor methods with content specifically tailored to jobs generally have greater validity for predicting job performance than general measures reflecting psychological constructs (e.g., cognitive abilities, personality traits). They also point out that the standard deviations around the means of their meta-analytic validity estimates are often large, leading to their question "why the variability?" (p. x). They suggest many legitimate contributors. We propose an additional moderator variable of critical importance: predictor–criterion construct congruence, which accounts for a great deal of the variability in validity coefficients found in meta-analysis. That is, the extent to which what is measured is congruent with what is predicted is an important determinant of the level of validity obtained. Sackett et al. (2022) acknowledge that the strongest predictors in their re-analysis are job-specific measures and that a "closer behavioral match between predictor and criterion" (p. 2062) might contribute to higher validities. Many in our field have also noted the importance of "behavioral consistency" between predictors and criteria relevant to selection, while also arguing for another type of congruence: the relationships between constructs in both the predictor and criterion space (e.g., Bartram, 2005; Campbell et al., 1993; Campbell & Knapp, 2001; Hogan & Holland, 2003; Hough, 1992; Hough & Oswald, 2005; Pulakos et al., 1988; Sackett & Lievens, 2008; Schmitt & Ostroff, 1986).

The above reflects an important distinction between two types of congruence: behavior-based congruence and construct-based congruence. When "past behavior predicts future behavior" (as may be the case for jobs requiring past experience, and where behavior-oriented employment assessments such as interviews, biodata, and work samples are involved), behavior-based congruence exists. Behavior-based assessments can vary a great deal across jobs but tend to ask about past experiences that are influenced by a complex mix of KSAOs. By contrast, construct-based congruence aligns employment tests of job-relevant KSAOs (e.g., verbal and math skills, conscientiousness) with relevant work criteria, such as technical performance or counterproductive work behavior (e.g., Campbell & Wiernik, 2015). What we suggest strongly here is that, regardless of the approach to congruence adopted in selection, the congruence between predictor and criterion constructs is a key factor.
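The statistical logic behind the authors' argument — that an unmodeled moderator inflates the spread of validity coefficients around the meta-analytic mean — can be illustrated with a small simulation. This is a hedged sketch, not an analysis from the article: all validity coefficients below are hypothetical numbers invented for illustration, and the grouping variable stands in for predictor–criterion construct congruence.

```python
# Illustrative sketch (hypothetical data, not from the article): how a
# moderator such as predictor-criterion construct congruence can account
# for variability in meta-analytic validity coefficients.
import statistics

# Hypothetical observed validities from individual studies, grouped by
# whether the predictor construct matched the criterion construct.
validities = {
    "congruent":   [0.42, 0.38, 0.45, 0.40, 0.44],
    "incongruent": [0.12, 0.18, 0.10, 0.15, 0.16],
}

# Pooled across all studies, as in a meta-analysis that ignores the
# moderator: the mean hides two distinct populations, so the SD is large.
pooled = [r for group in validities.values() for r in group]
print(f"pooled: mean r = {statistics.mean(pooled):.3f}, "
      f"SD = {statistics.stdev(pooled):.3f}")

# Within each level of the moderator, the SD shrinks sharply: the
# moderator "accounts for" much of the between-study variability.
for label, rs in validities.items():
    print(f"{label:>11}: mean r = {statistics.mean(rs):.3f}, "
          f"SD = {statistics.stdev(rs):.3f}")
```

Under these made-up numbers, the pooled standard deviation is several times larger than the within-group standard deviations, mirroring the article's point that coding studies for construct congruence should shrink the unexplained variability around meta-analytic validity estimates.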
Journal Introduction
Industrial and Organizational Psychology-Perspectives on Science and Practice is a peer-reviewed academic journal published on behalf of the Society for Industrial and Organizational Psychology. The journal focuses on interactive exchanges on topics of importance to the science and practice of the field. It features articles that present new ideas or different takes on existing ideas, stimulating dialogue about important issues in the field. Additionally, the journal is indexed and abstracted in Clarivate Analytics SSCI, Clarivate Analytics Web of Science, European Reference Index for the Humanities and Social Sciences (ERIH PLUS), ProQuest, PsycINFO, and Scopus.