Michelle P. Martín-Raugh, Emily A. Gallegos, Katrisha M. Smith, Ricardo R. Brooks, Harrison J. Kell
The Validity of Single-Response Situational Judgment Tests: A Nomological Network Meta-Analysis
International Journal of Selection and Assessment, 33(4). Published 2025-09-22. DOI: 10.1111/ijsa.70025
Citations: 0
Abstract
Nearly 15 years after the first empirical validation of the then-novel single-response situational judgment test (SJT) methodology, research using single-response SJTs has proliferated. Single-response SJTs feature a single edited critical incident that is evaluated by respondents, hence the term "single-response" SJT. Single-response SJT items bypass the need for experts to generate and evaluate response options, simplifying and reducing the cost of test construction. We report the first meta-analysis of the criterion-related validity of single-response SJTs and explore the nomological network surrounding the procedural knowledge measured by this format. Results from a random-effects meta-analysis (k = 20, N = 3685) demonstrate that associations between antecedents of single-response SJT scores and criteria mirrored those in the multiple-response SJT literature, with positive associations in all cases. The reliability estimates for single-response SJTs ranged from ⍺ = 0.37 to ⍺ = 0.93, with an average of ⍺ = 0.82. The 95% confidence interval for the uncorrected correlation for single-response SJTs (95% CI [0.12, 0.28]) encompasses the validity correlations for multiple-response SJTs reported by McDaniel et al. (2007) (0.20, 0.26). We found that single-response SJTs correlated 0.18 (uncorrected) and 0.20 (corrected) with job performance. Additionally, we meta-analyze the correlations between single-response SJT scores, personality, and emotional intelligence, and also explore their criterion-related validity. Despite the nascency of this research area and the fact that most studies were conducted in low-stakes lab settings, the findings suggest that, overall, single-response SJTs may be promising personnel selection tools.
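The abstract reports a random-effects meta-analysis pooling correlations across k = 20 studies. The paper does not specify its estimator here; as an illustrative sketch only, the following Python code pools study-level correlations with the common DerSimonian-Laird random-effects approach on Fisher z-transformed correlations (the example correlations and sample sizes are hypothetical, not the study's data).

```python
import math

def fisher_z(r):
    """Fisher z-transform of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    """Back-transform a Fisher z value to a correlation."""
    return (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)

def random_effects_meta(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlations.

    rs: per-study correlations; ns: per-study sample sizes.
    Returns the pooled correlation and its 95% confidence interval.
    """
    zs = [fisher_z(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]        # sampling variance of z
    ws = [1.0 / v for v in vs]              # fixed-effect (inverse-variance) weights
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    # Q heterogeneity statistic and DerSimonian-Laird tau^2 (between-study variance)
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)
    # random-effects weights add tau^2 to each study's sampling variance
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    se = math.sqrt(1.0 / sum(ws_re))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
    return inv_fisher_z(z_re), (inv_fisher_z(lo), inv_fisher_z(hi))

# Hypothetical per-study correlations and sample sizes, for illustration only
r_pooled, (ci_lo, ci_hi) = random_effects_meta([0.12, 0.20, 0.28], [100, 150, 120])
```

Corrections for criterion unreliability (as in the corrected 0.20 estimate) would be applied to each study's correlation before pooling; that step is omitted here.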
Journal introduction:
The International Journal of Selection and Assessment publishes original articles on all aspects of personnel selection, staffing, and assessment in organizations. Combining academic research with professional-led best practice, IJSA aims to develop new knowledge and understanding in these important areas of work psychology and contemporary workforce management.