Effect of limiting checklist on the validity of objective structured clinical examination: A comparative validity study.

Impact Factor 3.3 · CAS Tier 2 (Education) · JCR Q1, Education, Scientific Disciplines
Sun Jung Myung, Ju Whi Kim, Chan Woong Kim, Do Hoon Kim, Eungkyung Eo, Jong Hoon Kim, Jae Jin Han, Sangyoung Bae
Journal: Medical Teacher, pp. 1-7
DOI: https://doi.org/10.1080/0142159X.2024.2430364
Published: 2024-11-22 (Journal Article)
Citations: 0

Abstract

Background: The Objective Structured Clinical Examination (OSCE) is a cornerstone of medical education that uses a structured approach to assess clinical skills and competency. A well-designed checklist is essential to enhance the validity of OSCE exams. This study aimed to determine whether a clinically discriminatory checklist (CDC) improves the validity of the OSCE compared with an assessment using the thoroughness checklist (TC), with a particular focus on clinical reasoning.

Methods: Fourteen OSCE case scenarios with both TC and CDC were developed. Each case was administered to 350-1170 fourth-year medical students in nine medical schools within the Seoul-Gyeonggi area (Korea) during their OSCEs in 2019 and 2020. We also conducted interstation examinations after standardized patient encounters to assess clinical reasoning ability. The validities of OSCE scores based on the TCs and CDCs were compared.

Results: The OSCE using a CDC (rather than a TC) enabled better item discrimination but yielded a lower internal consistency coefficient and a larger standard error of measurement. Clinical reasoning scores derived from patient notes were significantly correlated with OSCE scores but varied according to the characteristics of each case, indicating that OSCE scores derived using CDCs did not assess clinical reasoning ability more accurately than OSCE scores obtained using TCs.

Conclusions: This study found that using a CDC to limit checklist items did not improve OSCE validity and did not reflect clinical reasoning ability. Further development of robust assessment strategies that support and evaluate clinical reasoning abilities is needed.

Source journal: Medical Teacher (Medicine – Health Care Sciences)
CiteScore: 7.80
Self-citation rate: 8.50%
Articles per year: 396
Review time: 3-6 weeks
Journal description: Medical Teacher provides accounts of new teaching methods, guidance on structuring courses and assessing achievement, and serves as a forum for communication between medical teachers and those involved in general education. In particular, the journal recognizes the problems teachers have in keeping up-to-date with the developments in educational methods that lead to more effective teaching and learning at a time when the content of the curriculum—from medical procedures to policy changes in health care provision—is also changing. The journal features reports of innovation and research in medical education, case studies, survey articles, practical guidelines, reviews of current literature and book reviews. All articles are peer reviewed.