Comparability of Objective Structured Clinical Examinations (OSCEs) and Written Tests for Assessing Medical School Students' Competencies: A Scoping Review.

IF 2.2 · CAS Tier 3 (Medicine) · Q2 HEALTH CARE SCIENCES & SERVICES
Oswin Chang, Anne M Holbrook, Simran Lohit, Jiawen Deng, Janice Xu, Munil Lee, Alan Cheng
{"title":"Comparability of Objective Structured Clinical Examinations (OSCEs) and Written Tests for Assessing Medical School Students' Competencies: A Scoping Review.","authors":"Oswin Chang,&nbsp;Anne M Holbrook,&nbsp;Simran Lohit,&nbsp;Jiawen Deng,&nbsp;Janice Xu,&nbsp;Munil Lee,&nbsp;Alan Cheng","doi":"10.1177/01632787231165797","DOIUrl":null,"url":null,"abstract":"<p><p>Objective Structured Clinical Examinations (OSCEs) and written tests are commonly used to assess health professional students, but it remains unclear whether the additional human resources and expenses required for OSCEs, both in-person and online, are worthwhile for assessing competencies. This scoping review summarized literature identified by searching MEDLINE and EMBASE comparing 1) OSCEs and written tests and 2) in-person and online OSCEs, for assessing health professional trainees' competencies. For Q1, 21 studies satisfied inclusion criteria. The most examined health profession was medical trainees (19, 90.5%), the comparison was most frequently OSCEs versus multiple-choice questions (MCQs) (18, 85.7%), and 18 (87.5%) examined the same competency domain. Most (77.5%) total score correlation coefficients between testing methods were weak (<i>r</i> < 0.40). For Q2, 13 articles were included. In-person and online OSCEs were most used for medical trainees (9, 69.2%), checklists were the most prevalent evaluation scheme (7, 63.6%), and 14/17 overall score comparisons were not statistically significantly different. Generally low correlations exist between MCQ and OSCE scores, providing insufficient evidence as to whether OSCEs provide sufficient value to be worth their additional cost. Online OSCEs may be a viable alternative to in-person OSCEs for certain competencies where technical challenges can be met.</p>","PeriodicalId":12315,"journal":{"name":"Evaluation & the Health Professions","volume":"46 3","pages":"213-224"},"PeriodicalIF":2.2000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10443966/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evaluation & the Health Professions","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1177/01632787231165797","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Citations: 0

Abstract

Objective Structured Clinical Examinations (OSCEs) and written tests are commonly used to assess health professional students, but it remains unclear whether the additional human resources and expenses required for OSCEs, both in-person and online, are worthwhile for assessing competencies. This scoping review summarized literature identified by searching MEDLINE and EMBASE comparing 1) OSCEs and written tests and 2) in-person and online OSCEs, for assessing health professional trainees' competencies. For Q1, 21 studies satisfied inclusion criteria. The most examined health profession was medical trainees (19, 90.5%), the comparison was most frequently OSCEs versus multiple-choice questions (MCQs) (18, 85.7%), and 18 (87.5%) examined the same competency domain. Most (77.5%) total score correlation coefficients between testing methods were weak (r < 0.40). For Q2, 13 articles were included. In-person and online OSCEs were most used for medical trainees (9, 69.2%), checklists were the most prevalent evaluation scheme (7, 63.6%), and 14/17 overall score comparisons were not statistically significantly different. Generally low correlations exist between MCQ and OSCE scores, providing insufficient evidence as to whether OSCEs provide sufficient value to be worth their additional cost. Online OSCEs may be a viable alternative to in-person OSCEs for certain competencies where technical challenges can be met.
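To make the review's weak-correlation threshold concrete, the sketch below (not part of the published study; the scores are invented purely for illustration) computes a Pearson correlation coefficient between paired OSCE and MCQ totals and checks it against the r < 0.40 cutoff used to label a correlation as weak.

# Minimal, hypothetical illustration of the r < 0.40 "weak correlation" cutoff.
# The score lists below are made up for demonstration and are NOT data from the review.
from statistics import mean, stdev

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson product-moment correlation of two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical percentage scores for eight trainees on both assessment formats.
osce_scores = [72.0, 65.0, 80.0, 58.0, 90.0, 74.0, 61.0, 83.0]
mcq_scores = [68.0, 77.0, 71.0, 62.0, 74.0, 80.0, 59.0, 70.0]

r = pearson_r(osce_scores, mcq_scores)
print(f"r = {r:.2f}: {'weak (r < 0.40)' if abs(r) < 0.40 else 'moderate or stronger'}")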

Source Journal
CiteScore: 5.30
Self-citation rate: 0.00%
Articles per year: 31
Review turnaround: >12 weeks
Journal Description: Evaluation & the Health Professions is a peer-reviewed, quarterly journal that provides health-related professionals with state-of-the-art methodological, measurement, and statistical tools for conceptualizing the etiology of health promotion and problems, and for developing, implementing, and evaluating health programs, teaching and training services, and products that pertain to a myriad of health dimensions. This journal is a member of the Committee on Publication Ethics (COPE). Average time from submission to first decision: 31 days.