Title: The equivalence of a high-stakes objective structured clinical exam adapted to suit a virtual delivery format
Authors: Karen Coetzee MA, Luxshi Amirthalingam BSc, Tabasom Eftekari BComm, Sandra Monteiro PhD
DOI: 10.1111/jep.14167 (https://onlinelibrary.wiley.com/doi/10.1111/jep.14167)
Journal: Journal of Evaluation in Clinical Practice, 31(1)
Published: 2024-10-24 (Journal Article)
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1111/jep.14167
Citations: 0
Abstract
Introduction
The COVID-19 pandemic necessitated rapid adaptation of clinical competence assessments, including the transition of Objective Structured Clinical Examinations (OSCE) from in-person to virtual formats. This study investigates the construct equivalence of a high-stakes OSCE, originally designed for in-person delivery, when adapted for a virtual format.
Methods
A retrospective analysis was conducted using OSCE scores from the Internationally Educated Nurse Competency Assessment Program (IENCAP®). Data were collected from 15 exam administrations between January 2018 and June 2022, encompassing 2021 examinees (1936 in-person, 85 virtual). The Many-Facet Rasch Measurement (MFRM) model was employed to analyze the invariance of examinee ability, case difficulty, and criteria difficulty across in-person and virtual formats.
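The MFRM analysis described above decomposes each observed score into additive facet effects (examinee ability, case difficulty, criterion difficulty) on a shared logit scale. The following is a minimal sketch of the dichotomous form of that model for intuition only; the study's actual analysis would use polytomous rating scales and dedicated software, and all parameter values below are hypothetical.

```python
import math

def mfrm_probability(ability, case_difficulty, criterion_difficulty):
    """Probability of a correct (score = 1) response under a dichotomous
    many-facet Rasch model, where the logit is the examinee's ability
    minus the difficulty contributions of the case and the criterion."""
    logit = ability - case_difficulty - criterion_difficulty
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical values, in logits: an examinee of ability 1.0,
# on a case of difficulty 0.5, scored against a criterion of
# difficulty 0.2. Invariance testing asks whether these facet
# estimates stay stable across in-person and virtual formats.
p = mfrm_probability(1.0, 0.5, 0.2)
print(f"P(score = 1) = {p:.3f}")
```

When all facet parameters are equal, the model gives a 0.5 probability by construction; format equivalence in the study corresponds to these parameter estimates not shifting systematically between delivery modes.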
Results
Results revealed that overall examinee ability estimates remained invariant regardless of the OSCE format, while invariance violations were identified in only three of the 15 cases (20%) adapted to suit the virtual format. The most significant adaptation, a verbal physical examination substituted for hands-on assessment to suit the virtual context, achieved equivalence to its in-person counterpart, given evidence of invariance across criteria estimates. Interestingly, criteria scores in the cases with invariance violations displayed greater stability and consistency in the virtual OSCE format than in its in-person counterpart, highlighting a potential benefit of the virtual format that may be linked to the verbal physical examination.
Conclusion
The study found that while examinee ability and case difficulty estimates exhibited some invariance between in-person and virtual OSCE formats, criteria involving physical assessments faced challenges in maintaining construct equivalence. These findings highlight the need for careful consideration in adapting high-stakes clinical assessments to virtual formats to ensure fairness and reliability.
About the journal:
The Journal of Evaluation in Clinical Practice aims to promote the evaluation and development of clinical practice across medicine, nursing and the allied health professions. All aspects of health services research and public health policy analysis and debate are of interest to the Journal whether studied from a population-based or individual patient-centred perspective. Of particular interest to the Journal are submissions on all aspects of clinical effectiveness and efficiency including evidence-based medicine, clinical practice guidelines, clinical decision making, clinical services organisation, implementation and delivery, health economic evaluation, health process and outcome measurement and new or improved methods (conceptual and statistical) for systematic inquiry into clinical practice. Papers may take a classical quantitative or qualitative approach to investigation (or may utilise both techniques) or may take the form of learned essays, structured/systematic reviews and critiques.