The script concordance test for clinical reasoning in paediatric medicine: Medical student performance and expert panel reliability

A. Morris, D. Campbell
{"title":"儿科医学临床推理的文字一致性检验:医学生的表现和专家小组的可靠性","authors":"A. Morris, D. Campbell","doi":"10.11157/FOHPE.V16I2.65","DOIUrl":null,"url":null,"abstract":"Background: This study aimed to determine the correlation between student performance in clinical reasoning on the Script Concordance Test (SCT) and a modified essay question (MEQ) exam in a paediatric teaching block and to measure the intra-rater reliability of the expert scoring panel. Method: A 65-item assessment was developed using the accepted SCT method and scored against the responses of a panel of 10 general and subspecialty paediatricians. Student scores for the summative modified essay question examination at the end of the child and adolescent health block were compared with the score on the SCT. Intra-expert reliability was measured for the 10 paediatricians on the expert panel. Results: One hundred and two students completed both the SCT and the MEQ examination, with the correlation coefficient indicating moderate correlation (r = 0.46). The weighted Cohen kappa for the paediatricians on the panel ranged from 0.61–0.86, demonstrating good to excellent intra-rater agreement. Conclusion: We found that the MEQ is not a reliable means of measuring clinical reasoning of medical students, with only moderate correlation with the SCT, and that alternative methods such as SCT should be considered. Our finding of high reliability for paediatricians on the scoring panel is the first published using this methodology. It suggests that for lower stakes examinations, there is no need to re-test examiners. We do, however, propose that this simple method of assessing intra-rater reliability should be considered for high-stakes medical student examinations.","PeriodicalId":306686,"journal":{"name":"Focus on health professional education : a multi-disciplinary journal","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The script concordance test for clinical reasoning in paediatric medicine: Medical student performance and expert panel reliability\",\"authors\":\"A. Morris, D. Campbell\",\"doi\":\"10.11157/FOHPE.V16I2.65\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Background: This study aimed to determine the correlation between student performance in clinical reasoning on the Script Concordance Test (SCT) and a modified essay question (MEQ) exam in a paediatric teaching block and to measure the intra-rater reliability of the expert scoring panel. Method: A 65-item assessment was developed using the accepted SCT method and scored against the responses of a panel of 10 general and subspecialty paediatricians. Student scores for the summative modified essay question examination at the end of the child and adolescent health block were compared with the score on the SCT. Intra-expert reliability was measured for the 10 paediatricians on the expert panel. Results: One hundred and two students completed both the SCT and the MEQ examination, with the correlation coefficient indicating moderate correlation (r = 0.46). The weighted Cohen kappa for the paediatricians on the panel ranged from 0.61–0.86, demonstrating good to excellent intra-rater agreement. Conclusion: We found that the MEQ is not a reliable means of measuring clinical reasoning of medical students, with only moderate correlation with the SCT, and that alternative methods such as SCT should be considered. 
Our finding of high reliability for paediatricians on the scoring panel is the first published using this methodology. It suggests that for lower stakes examinations, there is no need to re-test examiners. We do, however, propose that this simple method of assessing intra-rater reliability should be considered for high-stakes medical student examinations.\",\"PeriodicalId\":306686,\"journal\":{\"name\":\"Focus on health professional education : a multi-disciplinary journal\",\"volume\":\"21 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-04-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Focus on health professional education : a multi-disciplinary journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.11157/FOHPE.V16I2.65\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Focus on health professional education : a multi-disciplinary journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11157/FOHPE.V16I2.65","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Background: This study aimed to determine the correlation between student performance in clinical reasoning on the Script Concordance Test (SCT) and a modified essay question (MEQ) exam in a paediatric teaching block, and to measure the intra-rater reliability of the expert scoring panel.

Method: A 65-item assessment was developed using the accepted SCT method and scored against the responses of a panel of 10 general and subspecialty paediatricians. Student scores for the summative modified essay question examination at the end of the child and adolescent health block were compared with the score on the SCT. Intra-rater reliability was measured for the 10 paediatricians on the expert panel.
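For illustration, the following is a minimal Python sketch of the aggregate scoring commonly used for SCT items, in which each answer option earns credit in proportion to how many panellists chose it, relative to the modal option. The function name and the example panel answers are hypothetical and not drawn from the paper.

from collections import Counter

def sct_item_scores(panel_responses):
    """Aggregate SCT scoring: each Likert option earns credit
    proportional to the number of panellists who chose it,
    relative to the modal (most frequently chosen) option.

    panel_responses: the panel's answers for one item, e.g. the
    Likert values (-2..+2) given by a 10-member panel.
    Returns a dict mapping each option to its credit in [0, 1].
    """
    counts = Counter(panel_responses)
    modal = max(counts.values())
    return {option: n / modal for option, n in counts.items()}

# Hypothetical item: 6 of 10 panellists answered +1, 3 answered 0, 1 answered +2.
credits = sct_item_scores([+1] * 6 + [0] * 3 + [+2])
print(credits)  # -> {+1: 1.0, 0: 0.5, +2: ~0.17}; a student choosing +1 gets full credit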
Results: One hundred and two students completed both the SCT and the MEQ examination, with the correlation coefficient indicating moderate correlation (r = 0.46). The weighted Cohen kappa for the paediatricians on the panel ranged from 0.61 to 0.86, demonstrating good to excellent intra-rater agreement.

Conclusion: We found that the MEQ is not a reliable means of measuring the clinical reasoning of medical students, correlating only moderately with the SCT, and that alternative methods such as the SCT should be considered. Our finding of high reliability for the paediatricians on the scoring panel is the first published using this methodology. It suggests that, for lower-stakes examinations, there is no need to re-test examiners. We do, however, propose that this simple method of assessing intra-rater reliability be considered for high-stakes medical student examinations.
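As a sketch of the two statistics reported above, the Python snippet below computes a Pearson correlation between two sets of scores (scipy.stats.pearsonr) and a weighted Cohen's kappa between a rater's two passes over the same items (sklearn.metrics.cohen_kappa_score). The data are randomly generated, and the linear weighting is an assumption; the paper does not state which weighting scheme was used, so the printed numbers will not reproduce the study's results.

import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical percentage scores for the same 102 students on both assessments.
sct_scores = rng.normal(70, 8, size=102)
meq_scores = 0.5 * sct_scores + rng.normal(35, 7, size=102)

r, p = pearsonr(sct_scores, meq_scores)
print(f"SCT vs MEQ correlation: r = {r:.2f} (p = {p:.3g})")

# Intra-rater reliability: one panellist's Likert answers (-2..+2) to the
# same 65 items on two occasions. A weighted kappa penalises near-misses
# less than distant disagreements.
first_pass = rng.integers(-2, 3, size=65)
drift = rng.integers(-1, 2, size=65)  # occasional one-step drift between passes
second_pass = np.clip(first_pass + drift, -2, 2)

kappa = cohen_kappa_score(first_pass, second_pass, weights="linear")
print(f"Weighted Cohen's kappa: {kappa:.2f}")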