Racial and Ethnic Bias in Letters of Recommendation in Academic Medicine: A Systematic Review

IF 5.3 | CAS Tier 2 (Education) | Q1 EDUCATION, SCIENTIFIC DISCIPLINES
Academic Medicine | Pub Date: 2024-09-01 | Epub Date: 2024-03-08 | DOI: 10.1097/ACM.0000000000005688
Saarang R Deshpande, Gina Lepore, Lily Wieland, Jennifer R Kogan
{"title":"学术医学推荐信中的种族和民族偏见:系统综述。","authors":"Saarang R Deshpande, Gina Lepore, Lily Wieland, Jennifer R Kogan","doi":"10.1097/ACM.0000000000005688","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Letters of recommendations (LORs) are key components of academic medicine applications. Given that bias against students and trainees underrepresented in medicine (UIM) has been demonstrated across assessment, achievement, and advancement domains, the authors reviewed studies on LORs to assess racial, ethnic, and UIM differences in LORs. Standardized LORs (SLORs), an increasingly common form of LORs, were also assessed for racial and ethnic differences.</p><p><strong>Method: </strong>A systematic review was conducted for English-language studies that assessed racial or ethnic differences in LORs in academic medicine published from database inception to July 16, 2023. Studies evaluating SLORs underwent data abstraction to evaluate their impact on the given race or ethnicity comparison and outcome variables.</p><p><strong>Results: </strong>Twenty-three studies describing 19,012 applicants and 41,925 LORs were included. Nineteen studies (82.6%) assessed LORs for residency, 4 (17.4%) assessed LORs for fellowship, and none evaluated employment or promotion. Fifteen of 17 studies (88.2%) assessing linguistic differences reported a significant difference in a particular race or ethnicity comparison. Of the 7 studies assessing agentic language (e.g., \"strong,\" \"confident\"), 1 study found fewer agentic terms used for Black and Latinx applicants, and 1 study reported higher agency scores for Asian applicants and applicants of races other than White. There were mixed results for the use of communal and grindstone language in UIM and non-UIM comparisons. Among 6 studies, 4 (66.7%) reported that standout language (e.g., \"exceptional,\" \"outstanding\") was less likely to be ascribed to UIM applicants. Doubt-raising language was more frequently used for UIM trainees. When SLORs and unstructured LORs were compared, fewer linguistic differences were found in SLORs.</p><p><strong>Conclusions: </strong>There is a moderate bias against UIM candidates in the domains of linguistic differences, doubt-raising language, and topics discussed in LORs, which has implications for perceptions of competence and ability in the high-stakes residency and fellowship application process.</p>","PeriodicalId":50929,"journal":{"name":"Academic Medicine","volume":" ","pages":"1032-1037"},"PeriodicalIF":5.3000,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Racial and Ethnic Bias in Letters of Recommendation in Academic Medicine: A Systematic Review.\",\"authors\":\"Saarang R Deshpande, Gina Lepore, Lily Wieland, Jennifer R Kogan\",\"doi\":\"10.1097/ACM.0000000000005688\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>Letters of recommendations (LORs) are key components of academic medicine applications. Given that bias against students and trainees underrepresented in medicine (UIM) has been demonstrated across assessment, achievement, and advancement domains, the authors reviewed studies on LORs to assess racial, ethnic, and UIM differences in LORs. 
Standardized LORs (SLORs), an increasingly common form of LORs, were also assessed for racial and ethnic differences.</p><p><strong>Method: </strong>A systematic review was conducted for English-language studies that assessed racial or ethnic differences in LORs in academic medicine published from database inception to July 16, 2023. Studies evaluating SLORs underwent data abstraction to evaluate their impact on the given race or ethnicity comparison and outcome variables.</p><p><strong>Results: </strong>Twenty-three studies describing 19,012 applicants and 41,925 LORs were included. Nineteen studies (82.6%) assessed LORs for residency, 4 (17.4%) assessed LORs for fellowship, and none evaluated employment or promotion. Fifteen of 17 studies (88.2%) assessing linguistic differences reported a significant difference in a particular race or ethnicity comparison. Of the 7 studies assessing agentic language (e.g., \\\"strong,\\\" \\\"confident\\\"), 1 study found fewer agentic terms used for Black and Latinx applicants, and 1 study reported higher agency scores for Asian applicants and applicants of races other than White. There were mixed results for the use of communal and grindstone language in UIM and non-UIM comparisons. Among 6 studies, 4 (66.7%) reported that standout language (e.g., \\\"exceptional,\\\" \\\"outstanding\\\") was less likely to be ascribed to UIM applicants. Doubt-raising language was more frequently used for UIM trainees. When SLORs and unstructured LORs were compared, fewer linguistic differences were found in SLORs.</p><p><strong>Conclusions: </strong>There is a moderate bias against UIM candidates in the domains of linguistic differences, doubt-raising language, and topics discussed in LORs, which has implications for perceptions of competence and ability in the high-stakes residency and fellowship application process.</p>\",\"PeriodicalId\":50929,\"journal\":{\"name\":\"Academic Medicine\",\"volume\":\" \",\"pages\":\"1032-1037\"},\"PeriodicalIF\":5.3000,\"publicationDate\":\"2024-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Academic Medicine\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1097/ACM.0000000000005688\",\"RegionNum\":2,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/3/8 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Academic Medicine","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1097/ACM.0000000000005688","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/3/8 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Citations: 0

Abstract


Purpose: Letters of recommendation (LORs) are key components of academic medicine applications. Given that bias against students and trainees underrepresented in medicine (UIM) has been demonstrated across assessment, achievement, and advancement domains, the authors reviewed studies on LORs to assess racial, ethnic, and UIM differences in LORs. Standardized LORs (SLORs), an increasingly common form of LORs, were also assessed for racial and ethnic differences.

Method: A systematic review was conducted for English-language studies that assessed racial or ethnic differences in LORs in academic medicine published from database inception to July 16, 2023. Studies evaluating SLORs underwent data abstraction to evaluate their impact on the given race or ethnicity comparison and outcome variables.

Results: Twenty-three studies describing 19,012 applicants and 41,925 LORs were included. Nineteen studies (82.6%) assessed LORs for residency, 4 (17.4%) assessed LORs for fellowship, and none evaluated employment or promotion. Fifteen of 17 studies (88.2%) assessing linguistic differences reported a significant difference in a particular race or ethnicity comparison. Of the 7 studies assessing agentic language (e.g., "strong," "confident"), 1 study found fewer agentic terms used for Black and Latinx applicants, and 1 study reported higher agency scores for Asian applicants and applicants of races other than White. There were mixed results for the use of communal and grindstone language in UIM and non-UIM comparisons. Among 6 studies, 4 (66.7%) reported that standout language (e.g., "exceptional," "outstanding") was less likely to be ascribed to UIM applicants. Doubt-raising language was more frequently used for UIM trainees. When SLORs and unstructured LORs were compared, fewer linguistic differences were found in SLORs.

Conclusions: There is a moderate bias against UIM candidates in the domains of linguistic differences, doubt-raising language, and topics discussed in LORs, which has implications for perceptions of competence and ability in the high-stakes residency and fellowship application process.

Source journal
Academic Medicine (Medicine - Health Care Sciences & Services)
CiteScore: 7.80
Self-citation rate: 9.50%
Articles per year: 982
Review time: 3-6 weeks
About the journal: Academic Medicine, the official peer-reviewed journal of the Association of American Medical Colleges, acts as an international forum for exchanging ideas, information, and strategies to address the significant challenges in academic medicine. The journal covers areas such as research, education, clinical care, community collaboration, and leadership, with a commitment to serving the public interest.