Prospects for reducing group mean differences on cognitive tests via item selection strategies.

Impact Factor 9.4 · CAS Tier 1 (Psychology) · JCR Q1 (Management)
Isaac M Bazian, Samuel D Lee, Paul R Sackett, Nathan R Kuncel, Rick R Jacobs, Michael A McDaniel
Journal of Applied Psychology · DOI: 10.1037/apl0001253 · Published online 2024-11-18
Citations: 0

Abstract

Cognitive ability tests are widely used in employee selection contexts, but large race and ethnic subgroup mean differences in test scores represent a major drawback to their use. We examine the potential for an item-level procedure to reduce these test score mean differences. In three data sets, differing proportions of cognitive ability test items with higher levels of difficulty or subgroup mean differences were removed from the tests. The reliabilities of these trimmed tests were then corrected back to the lengths of the original tests, and the subgroup mean differences of the trimmed tests were compared to those of the original tests. Results indicate that it is not possible to come anywhere close to eliminating subgroup differences via item trimming. The procedure may modestly reduce subgroup mean differences in test scores, with effects becoming stronger as higher proportions of items are removed from the tests. Removing items based on difficulty or subgroup differences has roughly similar impacts on test score mean differences for Black-White test taker comparisons, but results are more mixed for Hispanic-White comparisons. Our results also provide preliminary evidence that removing items on the basis of subgroup mean differences may have relatively little effect on test criterion-related validity, but the impact of removing difficult items was more mixed. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
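The trimming-and-correction procedure the abstract outlines can be sketched roughly as follows. This is a hypothetical illustration, not the authors' code: the function names, the use of Cohen's d as the subgroup-difference metric, the 0/1 item-response matrices, and the Spearman-Brown prophecy formula for the length correction are all assumptions made for the sketch.

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference between two groups' test scores (pooled SD)."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled

def spearman_brown(rel, length_factor):
    """Project the reliability of a shortened test back to the original length
    (length_factor > 1 lengthens; this mirrors the 'corrected back' step)."""
    return length_factor * rel / (1 + (length_factor - 1) * rel)

def trim_by_item_d(items_a, items_b, prop_removed):
    """Drop the proportion of items with the largest group mean differences,
    then return the test-level d for the trimmed test.

    items_a, items_b: (n_persons, n_items) 0/1 response matrices per group.
    """
    item_gap = items_a.mean(axis=0) - items_b.mean(axis=0)  # per-item p-value gap
    k = items_a.shape[1]
    n_keep = k - int(round(prop_removed * k))
    keep = np.argsort(np.abs(item_gap))[:n_keep]  # keep smallest-gap items
    return cohens_d(items_a[:, keep].sum(axis=1), items_b[:, keep].sum(axis=1))
```

An analogous trim could sort on item difficulty (each item's overall pass rate) instead of `item_gap`; the study compares both selection rules across increasing values of `prop_removed`.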

Source journal: Journal of Applied Psychology — CiteScore 17.60 · Self-citation rate 6.10% · Articles per year: 175
About the journal: The Journal of Applied Psychology® focuses on publishing original investigations that contribute new knowledge and understanding to fields of applied psychology (excluding clinical and applied experimental or human factors, which are better suited for other APA journals). The journal primarily considers empirical and theoretical investigations that enhance understanding of cognitive, motivational, affective, and behavioral psychological phenomena in work and organizational settings. These phenomena can occur at individual, group, organizational, or cultural levels, and in various work settings such as business, education, training, health, service, government, or military institutions. The journal welcomes submissions from both public and private sector organizations, for-profit or nonprofit. It publishes several types of articles, including: 1. Rigorously conducted empirical investigations that expand conceptual understanding (original investigations or meta-analyses). 2. Theory development articles and integrative conceptual reviews that synthesize literature and generate new theories on psychological phenomena to stimulate novel research. 3. Rigorously conducted qualitative research on phenomena that are challenging to capture with quantitative methods or require inductive theory building.