Exploring the Impact of Deleting (or Retaining) a Biased Item: A Procedure Based on Classification Accuracy.

IF 3.5 · CAS Tier 2 (Psychology) · JCR Q1 (Psychology, Clinical)
Meltem Ozcan, Mark H C Lai
Journal: Assessment · DOI: 10.1177/10731911241298081 · Published: 2024-12-10 (Journal Article)
Citations: 0

Abstract

Exploring the Impact of Deleting (or Retaining) a Biased Item: A Procedure Based on Classification Accuracy.

Psychological test scores are commonly used in high-stakes settings to classify individuals. While measurement invariance across groups is necessary for valid and meaningful inferences of group differences, full measurement invariance rarely holds in practice. The classification accuracy analysis framework aims to quantify the degree and practical impact of noninvariance. However, how to best navigate the next steps remains unclear, and methods devised to account for noninvariance at the group level may be insufficient when the goal is classification. Furthermore, deleting a biased item may improve fairness but negatively affect performance, and replacing the test can be costly. We propose item-level effect size indices that allow test users to make more informed decisions by quantifying the impact of deleting (or retaining) an item on test performance and fairness, provide an illustrative example, and introduce unbiasr, an R package implementing the proposed methods.

Source journal: Assessment (Psychology, Clinical)
CiteScore: 8.90
Self-citation rate: 2.60%
Annual articles: 86
Journal description: Assessment publishes articles in the domain of applied clinical assessment. The emphasis of this journal is on the publication of information relevant to the use of assessment measures, including test development, validation, and interpretation practices. The scope of the journal includes research that can inform assessment practices in mental health, forensic, medical, and other applied settings. Papers that focus on the assessment of cognitive and neuropsychological functioning, personality, and psychopathology are invited. Most papers published in Assessment report the results of original empirical research; however, integrative review articles and scholarly case studies will also be considered.