{"title":"心理学专业中客观结构化临床检查的心理测量质量:系统回顾","authors":"Azaan Vhora, Ryan L. Davies, Kylie Rice","doi":"10.1177/14757257231196707","DOIUrl":null,"url":null,"abstract":"Background: Objective Structured Clinical Examinations (OSCEs) are a simulation-based assessment tool used extensively in medical education for evaluating clinical competence. OSCEs are widely regarded as more valid, reliable, and valuable compared to traditional assessment measures, and are now emerging within professional psychology training programs. While there is a lack of findings related to the quality of OSCEs in published psychology literature, psychometric properties can be inferred by investigating implementation. Accordingly, the current review assessed implementation of OSCEs within psychology programs against a set of Quality Assurance Guidelines (QAGs). Methods: A systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) recommendations. Electronic databases including ProQuest Psychology, PsycArticles, Psychology and Behavioural Sciences Collection, PsycInfo and key indexing databases such as Scopus, ProQuest, and Web of Science were used to identify relevant articles. Twelve full-text articles met all inclusion criteria and were included in the review. Results: There was considerable heterogeneity in the quality of studies and reporting of OSCE data. Implementation of OSCEs against QAGs revealed overall adherence to be “Fair.” Conclusion: The current review consolidated what is known on psychometric quality of OSCEs within psychology programs. A further need for quantitative evidence on psychometric soundness of OSCEs within psychology training is highlighted. 
Furthermore, it is recommended that future training programs implement and report OSCEs in accordance with standardized guidelines.","PeriodicalId":345415,"journal":{"name":"Psychology Learning & Teaching","volume":"87 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Psychometric Quality of Objective Structured Clinical Examinations Within Psychology Programs: A Systematic Review\",\"authors\":\"Azaan Vhora, Ryan L. Davies, Kylie Rice\",\"doi\":\"10.1177/14757257231196707\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Background: Objective Structured Clinical Examinations (OSCEs) are a simulation-based assessment tool used extensively in medical education for evaluating clinical competence. OSCEs are widely regarded as more valid, reliable, and valuable compared to traditional assessment measures, and are now emerging within professional psychology training programs. While there is a lack of findings related to the quality of OSCEs in published psychology literature, psychometric properties can be inferred by investigating implementation. Accordingly, the current review assessed implementation of OSCEs within psychology programs against a set of Quality Assurance Guidelines (QAGs). Methods: A systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) recommendations. Electronic databases including ProQuest Psychology, PsycArticles, Psychology and Behavioural Sciences Collection, PsycInfo and key indexing databases such as Scopus, ProQuest, and Web of Science were used to identify relevant articles. Twelve full-text articles met all inclusion criteria and were included in the review. Results: There was considerable heterogeneity in the quality of studies and reporting of OSCE data. 
Implementation of OSCEs against QAGs revealed overall adherence to be “Fair.” Conclusion: The current review consolidated what is known on psychometric quality of OSCEs within psychology programs. A further need for quantitative evidence on psychometric soundness of OSCEs within psychology training is highlighted. Furthermore, it is recommended that future training programs implement and report OSCEs in accordance with standardized guidelines.\",\"PeriodicalId\":345415,\"journal\":{\"name\":\"Psychology Learning & Teaching\",\"volume\":\"87 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Psychology Learning & Teaching\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1177/14757257231196707\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Psychology Learning & Teaching","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/14757257231196707","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The Psychometric Quality of Objective Structured Clinical Examinations Within Psychology Programs: A Systematic Review
Background: Objective Structured Clinical Examinations (OSCEs) are a simulation-based assessment tool used extensively in medical education to evaluate clinical competence. OSCEs are widely regarded as more valid, reliable, and valuable than traditional assessment measures, and are now emerging within professional psychology training programs. While the published psychology literature offers few findings on the quality of OSCEs, psychometric properties can be inferred by investigating implementation. Accordingly, the current review assessed the implementation of OSCEs within psychology programs against a set of Quality Assurance Guidelines (QAGs). Methods: A systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations. Electronic databases including ProQuest Psychology, PsycArticles, Psychology and Behavioural Sciences Collection, and PsycInfo, as well as key indexing databases such as Scopus, ProQuest, and Web of Science, were used to identify relevant articles. Twelve full-text articles met all inclusion criteria and were included in the review. Results: There was considerable heterogeneity in the quality of studies and in the reporting of OSCE data. Assessment of OSCE implementation against the QAGs revealed overall adherence to be "Fair." Conclusion: The current review consolidated what is known about the psychometric quality of OSCEs within psychology programs. A further need for quantitative evidence on the psychometric soundness of OSCEs within psychology training is highlighted. It is recommended that future training programs implement and report OSCEs in accordance with standardized guidelines.