Han Ting Jillian Yeo, Dujeepa Dasharatha Samarasekera, Michael Dean
{"title":"利用反馈机制来提高新加坡客观结构化临床检查的质量:一项探索性行动研究。","authors":"Han Ting Jillian Yeo, Dujeepa Dasharatha Samarasekera, Michael Dean","doi":"10.3352/jeehp.2025.22.28","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>Variability in examiner scoring threatens the fairness and reliability of objective structured clinical examinations (OSCEs). While examiner standardization exists, there is currently no structured, psychometric-informed, individualized feedback mechanism for examiners. This study explored the feasibility and perceived value of such a mechanism using an action research approach to co-design and iteratively refine examiner feedback reports.</p><p><strong>Methods: </strong>Two exploratory cycles were conducted between November 2023 and June 2024 with phase 4 OSCE examiners at the Yong Loo Lin School of Medicine. In cycle 1, psychometric analyses of examiner scoring for a phase 4 OSCE informed the design of individualized reports, which were evaluated through interviews. Revisions were made to the format of the report and implemented in cycle 2, where examiner responses were again collected. Data were analyzed thematically, supported by reflective logs and field notes.</p><p><strong>Results: </strong>Nine examiners participated in cycle 1 and 7 in cycle 2. In cycle 1, examiners highlighted challenges in interpreting complex terminology, leading to report refinements such as glossaries and visual graphs. In cycle 2, examiners demonstrated greater confidence in applying feedback, requested longitudinal reports, and shifted from initial resistance to reflective engagement. Across cycles, the reports improved credibility, neutrality, and examiner self-regulation.</p><p><strong>Conclusion: </strong>This exploratory study suggests that psychometric-informed feedback reports can facilitate examiner reflection and transparency in OSCEs. While the findings highlight feasibility and examiner acceptance, longitudinal delivery of feedback, collection of quantitative outcome data, and larger samples are needed to establish whether such reports improve scoring consistency and assessment fairness.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"22 ","pages":"28"},"PeriodicalIF":3.7000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Leveraging feedback mechanisms to improve the quality of objective structured clinical examinations in Singapore: an exploratory action research study.\",\"authors\":\"Han Ting Jillian Yeo, Dujeepa Dasharatha Samarasekera, Michael Dean\",\"doi\":\"10.3352/jeehp.2025.22.28\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>Variability in examiner scoring threatens the fairness and reliability of objective structured clinical examinations (OSCEs). While examiner standardization exists, there is currently no structured, psychometric-informed, individualized feedback mechanism for examiners. This study explored the feasibility and perceived value of such a mechanism using an action research approach to co-design and iteratively refine examiner feedback reports.</p><p><strong>Methods: </strong>Two exploratory cycles were conducted between November 2023 and June 2024 with phase 4 OSCE examiners at the Yong Loo Lin School of Medicine. 
In cycle 1, psychometric analyses of examiner scoring for a phase 4 OSCE informed the design of individualized reports, which were evaluated through interviews. Revisions were made to the format of the report and implemented in cycle 2, where examiner responses were again collected. Data were analyzed thematically, supported by reflective logs and field notes.</p><p><strong>Results: </strong>Nine examiners participated in cycle 1 and 7 in cycle 2. In cycle 1, examiners highlighted challenges in interpreting complex terminology, leading to report refinements such as glossaries and visual graphs. In cycle 2, examiners demonstrated greater confidence in applying feedback, requested longitudinal reports, and shifted from initial resistance to reflective engagement. Across cycles, the reports improved credibility, neutrality, and examiner self-regulation.</p><p><strong>Conclusion: </strong>This exploratory study suggests that psychometric-informed feedback reports can facilitate examiner reflection and transparency in OSCEs. While the findings highlight feasibility and examiner acceptance, longitudinal delivery of feedback, collection of quantitative outcome data, and larger samples are needed to establish whether such reports improve scoring consistency and assessment fairness.</p>\",\"PeriodicalId\":46098,\"journal\":{\"name\":\"Journal of Educational Evaluation for Health Professions\",\"volume\":\"22 \",\"pages\":\"28\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2025-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational Evaluation for Health Professions\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3352/jeehp.2025.22.28\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/9/30 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Evaluation for Health Professions","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3352/jeehp.2025.22.28","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/9/30 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Leveraging feedback mechanisms to improve the quality of objective structured clinical examinations in Singapore: an exploratory action research study.
Purpose: Variability in examiner scoring threatens the fairness and reliability of objective structured clinical examinations (OSCEs). While examiner standardization exists, there is currently no structured, psychometric-informed, individualized feedback mechanism for examiners. This study explored the feasibility and perceived value of such a mechanism using an action research approach to co-design and iteratively refine examiner feedback reports.
Methods: Two exploratory cycles were conducted between November 2023 and June 2024 with phase 4 OSCE examiners at the Yong Loo Lin School of Medicine. In cycle 1, psychometric analyses of examiner scoring for a phase 4 OSCE informed the design of individualized reports, which were evaluated through interviews. Revisions were made to the format of the report and implemented in cycle 2, where examiner responses were again collected. Data were analyzed thematically, supported by reflective logs and field notes.
Results: Nine examiners participated in cycle 1 and seven in cycle 2. In cycle 1, examiners highlighted challenges in interpreting complex terminology, leading to report refinements such as glossaries and visual graphs. In cycle 2, examiners demonstrated greater confidence in applying feedback, requested longitudinal reports, and shifted from initial resistance to reflective engagement. Across cycles, the reports improved credibility, neutrality, and examiner self-regulation.
Conclusion: This exploratory study suggests that psychometric-informed feedback reports can facilitate examiner reflection and transparency in OSCEs. While the findings highlight feasibility and examiner acceptance, longitudinal delivery of feedback, collection of quantitative outcome data, and larger samples are needed to establish whether such reports improve scoring consistency and assessment fairness.
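To make the Methods more concrete: the abstract does not state which psychometric analyses of examiner scoring were performed, but one common examiner-level index is stringency/leniency, estimated as the mean deviation of an examiner's awarded scores from the station means. The Python sketch below is a hypothetical illustration only; the column names (examiner, station, score) and the toy values are assumptions, not data or methods from the study.

import pandas as pd

def examiner_stringency(scores: pd.DataFrame) -> pd.Series:
    # Mean deviation of each examiner's awarded scores from the station-level
    # mean: positive values suggest leniency, negative values stringency.
    station_means = scores.groupby("station")["score"].transform("mean")
    deviation = scores["score"] - station_means
    return deviation.groupby(scores["examiner"]).mean().sort_values()

# Toy values for illustration only (not study data).
df = pd.DataFrame({
    "examiner": ["A", "A", "B", "B", "C", "C"],
    "station": ["history", "physical", "history", "physical", "history", "physical"],
    "score": [14, 12, 18, 16, 15, 13],
})
print(examiner_stringency(df))

A report built from an index like this could, for example, show each examiner where their mean deviation sits relative to the examiner cohort, which is one plausible way the individualized feedback reports described above might be populated.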
About the journal:
Journal of Educational Evaluation for Health Professions aims to provide readers with state-of-the-art, practical information on educational evaluation for the health professions, in order to improve the quality of undergraduate, graduate, and continuing education. It specializes in educational evaluation, including the application of measurement theory to health professions education, the promotion of high-stakes examinations such as national licensing examinations, the improvement of nationwide and international education programs, computer-based testing, computerized adaptive testing, and the work of medical health regulatory bodies. Its scope covers a variety of professions that serve public health, including but not limited to: care workers, dental hygienists, dental technicians, dentists, dietitians, emergency medical technicians, health educators, medical record technicians, medical technologists, midwives, nurses, nursing aides, occupational therapists, opticians, oriental medical doctors, oriental medicine dispensers, oriental pharmacists, pharmacists, physical therapists, physicians, prosthetists and orthotists, radiological technologists, rehabilitation counselors, sanitary technicians, and speech-language therapists.