Assessing Resident Diagnostic Skills Using a Modified Bronchiolitis Score.

Andrea Rivera-Sepulveda, Muguette Isona
{"title":"Assessing Resident Diagnostic Skills Using a Modified Bronchiolitis Score.","authors":"Andrea Rivera-Sepulveda,&nbsp;Muguette Isona","doi":"10.7199/ped.oncall.2021.10","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Resident milestones are objective instruments that assess the resident's growth, progression in knowledge, and clinical diagnostic reasoning; but they rely on the subjective appraisal of the supervising attending. Little is known about the use of standardized instruments that may complement the evaluation of resident diagnostic skills in the academic setting.</p><p><strong>Objectives: </strong>Evaluate a modified bronchiolitis severity assessment tool by appraising the inter-rater variability and reliability between pediatric attendings and pediatric residents.</p><p><strong>Methods: </strong>Cross-sectional study of children under 24 months of age who presented to a Community Hospital's Emergency Department with bronchiolitis between January-June 2014. A paired pediatric attending and resident evaluated each patient. Evaluation included age-based respiratory rate (RR), retractions, peripheral saturation, and auscultation. Cohen's kappa (K) measured inter-rater agreement. Inter-rater reliability (IRR) was assessed using a one-way random, average measures intra-class correlation (ICC) to evaluate the degree of consistency and magnitude of disagreement between inter-raters. Value of >0.6 was considered substantial for kappa and good internal consistency for ICC.</p><p><strong>Results: </strong>Twenty patients were evaluated. Analysis showed fair agreement for the presence of retractions (K=0.31), auscultation (K=0.33), and total score (K=0.3). The RR (ICC=0.97), SpO<sub>2</sub> (ICC=1.0), auscultation (ICC=0.77), and total score (ICC=0.84) were scored similarly across both raters, indicating excellent IRR. Identification of retractions had the least agreement across all statistical analysis.</p><p><strong>Conclusion: </strong>The use of a standardized instrument, in conjunction with a trained resident-teaching staff, can help identify deficiencies in clinical competencies among residents and facilitate the learning process for the identification of pertinent clinical findings.</p>","PeriodicalId":19949,"journal":{"name":"Pediatric Oncall","volume":"18 1","pages":"11-16"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7935038/pdf/nihms-1664219.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pediatric Oncall","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.7199/ped.oncall.2021.10","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Background: Resident milestones are objective instruments that assess a resident's growth, progression in knowledge, and clinical diagnostic reasoning, but they rely on the subjective appraisal of the supervising attending. Little is known about the use of standardized instruments that may complement the evaluation of resident diagnostic skills in the academic setting.

Objectives: To evaluate a modified bronchiolitis severity assessment tool by appraising inter-rater variability and reliability between pediatric attendings and pediatric residents.

Methods: Cross-sectional study of children under 24 months of age who presented to a community hospital emergency department with bronchiolitis between January and June 2014. A paired pediatric attending and resident evaluated each patient. Evaluation included age-based respiratory rate (RR), retractions, peripheral oxygen saturation (SpO2), and auscultation. Cohen's kappa (K) measured inter-rater agreement. Inter-rater reliability (IRR) was assessed using a one-way random, average-measures intra-class correlation (ICC) to evaluate the degree of consistency and the magnitude of disagreement between raters. A value of >0.6 was considered substantial agreement for kappa and good internal consistency for ICC.
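
As a minimal illustration of the agreement statistics named above, the sketch below computes Cohen's kappa for two raters and a one-way random, average-measures intraclass correlation, ICC(1,k), from an n-subjects by k-raters score matrix. This is not the authors' analysis code; the function names and toy ratings are hypothetical.

```python
# Illustrative sketch: Cohen's kappa and one-way random, average-measures ICC(1,k).
# The toy attending/resident scores below are made up for demonstration only.
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters assigning categorical scores."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    # Observed proportion of agreement
    po = np.mean(r1 == r2)
    # Chance agreement expected from each rater's marginal frequencies
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (po - pe) / (1 - pe)

def icc_1k(ratings):
    """One-way random, average-measures ICC(1,k).

    `ratings` is an (n subjects) x (k raters) array of scores.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand_mean = x.mean()
    subject_means = x.mean(axis=1)
    # Between-subjects and within-subjects mean squares (one-way ANOVA)
    msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
    msw = np.sum((x - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / msb

# Hypothetical retraction scores from a paired attending and resident
attending = [1, 2, 0, 1, 1, 2, 0, 0, 1, 2]
resident  = [1, 1, 0, 2, 1, 2, 0, 1, 1, 2]
print("kappa    =", round(cohens_kappa(attending, resident), 2))
print("ICC(1,k) =", round(icc_1k(np.column_stack([attending, resident])), 2))
```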

Results: Twenty patients were evaluated. Analysis showed fair agreement for the presence of retractions (K=0.31), auscultation (K=0.33), and total score (K=0.3). The RR (ICC=0.97), SpO2 (ICC=1.0), auscultation (ICC=0.77), and total score (ICC=0.84) were scored similarly by both raters, indicating excellent IRR. Identification of retractions had the least agreement across all statistical analyses.

Conclusion: The use of a standardized instrument, in conjunction with a trained resident-teaching staff, can help identify deficiencies in clinical competencies among residents and facilitate the learning process for the identification of pertinent clinical findings.
