Effectiveness of continuing medical education.

Spyridon S Marinopoulos, Todd Dorman, Neda Ratanawongsa, Lisa M Wilson, Bimal H Ashar, Jeffrey L Magaziner, Redonda G Miller, Patricia A Thomas, Gregory P Prokopowicz, Rehan Qayyum, Eric B Bass
{"title":"Effectiveness of continuing medical education.","authors":"Spyridon S Marinopoulos,&nbsp;Todd Dorman,&nbsp;Neda Ratanawongsa,&nbsp;Lisa M Wilson,&nbsp;Bimal H Ashar,&nbsp;Jeffrey L Magaziner,&nbsp;Redonda G Miller,&nbsp;Patricia A Thomas,&nbsp;Gregory P Prokopowicz,&nbsp;Rehan Qayyum,&nbsp;Eric B Bass","doi":"","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>Despite the broad range of continuing medical education (CME) offerings aimed at educating practicing physicians through the provision of up-to-date clinical information, physicians commonly overuse, under-use, and misuse therapeutic and diagnostic interventions. It has been suggested that the ineffective nature of CME either accounts for the discrepancy between evidence and practice or at a minimum contributes to this gap. Understanding what CME tools and techniques are most effective in disseminating and retaining medical knowledge is critical to improving CME and thus diminishing the gap between evidence and practice. The purpose of this review was to comprehensively and systematically synthesize evidence regarding the effectiveness of CME and differing instructional designs in terms of knowledge, attitudes, skills, practice behavior, and clinical practice outcomes.</p><p><strong>Review methods: </strong>We formulated specific questions with input from external experts and representatives of the Agency for Healthcare Research and Quality (AHRQ) and the American College of Chest Physicians (ACCP) which nominated this topic. We systematically searched the literature using specific eligibility criteria, hand searching of selected journals, and electronic databases including: MEDLINE, EMBASE, the Cochrane Database of Systematic Reviews, The Cochrane Central Register of Controlled Trials (CENTRAL), the Cochrane Database of Abstracts of Reviews of Effects (DARE), PsycINFO, and the Educational Resource Information Center (ERIC). Two independent reviewers conducted title scans, abstract reviews, and then full article reviews to identify eligible articles. Each eligible article underwent double review for data abstraction and assessment of study quality.</p><p><strong>Results: </strong>Of the 68,000 citations identified by literature searching, 136 articles and 9 systematic reviews ultimately met our eligibility criteria. The overall quality of the literature was low and consequently firm conclusions were not possible. Despite this, the literature overall supported the concept that CME was effective, at least to some degree, in achieving and maintaining the objectives studied, including knowledge (22 of 28 studies), attitudes (22 of 26), skills (12 of 15), practice behavior (61 of 105), and clinical practice outcomes (14 of 33). Common themes included that live media was more effective than print, multimedia was more effective than single media interventions, and multiple exposures were more effective than a single exposure. The number of articles that addressed internal and/or external characteristics of CME activities was too small and the studies too heterogeneous to determine if any of these are crucial for CME success. Evidence was limited on the reliability and validity of the tools that have been used to assess CME effectiveness. 
Based on previous reviews, the evidence indicates that simulation methods in medical education are effective in the dissemination of psychomotor and procedural skills.</p><p><strong>Conclusions: </strong>Despite the low quality of the evidence, CME appears to be effective at the acquisition and retention of knowledge, attitudes, skills, behaviors and clinical outcomes. More research is needed to determine with any degree of certainty which types of media, techniques, and exposure volumes as well as what internal and external audience characteristics are associated with improvements in outcomes.</p>","PeriodicalId":72991,"journal":{"name":"Evidence report/technology assessment","volume":" 149","pages":"1-69"},"PeriodicalIF":0.0000,"publicationDate":"2007-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4781050/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evidence report/technology assessment","FirstCategoryId":"1085","ListUrlMain":"","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Objectives: Despite the broad range of continuing medical education (CME) offerings aimed at educating practicing physicians through the provision of up-to-date clinical information, physicians commonly overuse, underuse, and misuse therapeutic and diagnostic interventions. It has been suggested that the ineffectiveness of CME either accounts for the discrepancy between evidence and practice or at a minimum contributes to this gap. Understanding which CME tools and techniques are most effective in disseminating and retaining medical knowledge is critical to improving CME and thus narrowing the gap between evidence and practice. The purpose of this review was to comprehensively and systematically synthesize evidence regarding the effectiveness of CME and of differing instructional designs in terms of knowledge, attitudes, skills, practice behavior, and clinical practice outcomes.

Review methods: We formulated specific questions with input from external experts and from representatives of the Agency for Healthcare Research and Quality (AHRQ) and the American College of Chest Physicians (ACCP), which nominated this topic. We systematically searched the literature using specific eligibility criteria, hand searching of selected journals, and electronic databases, including MEDLINE, EMBASE, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials (CENTRAL), the Cochrane Database of Abstracts of Reviews of Effects (DARE), PsycINFO, and the Educational Resource Information Center (ERIC). Two independent reviewers conducted title scans, abstract reviews, and then full-article reviews to identify eligible articles. Each eligible article underwent double review for data abstraction and assessment of study quality.
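The screening pipeline described above (independent title, abstract, and full-text review by two reviewers) can be sketched as follows. This is an illustrative outline only: the function and field names are hypothetical, and the rule of advancing a citation when either reviewer includes it, with full-text disagreements flagged for adjudication, is a common systematic-review convention assumed here rather than stated in the report.

```python
# Illustrative sketch of a dual independent review pipeline (hypothetical names).
# Assumption: a citation advances past title/abstract screening if either
# reviewer includes it; full-text disagreements are set aside for adjudication.

from dataclasses import dataclass

@dataclass
class Citation:
    citation_id: str
    title: str
    abstract: str

def screen(citations, reviewer_a, reviewer_b):
    """Return (eligible, needs_adjudication) after dual independent review.

    reviewer_a / reviewer_b are callables mapping a Citation and a stage name
    ("title", "abstract", "full_text") to True (include) or False (exclude).
    """
    eligible, needs_adjudication = [], []
    for citation in citations:
        advanced = True
        for stage in ("title", "abstract"):
            # Liberal rule at early stages: keep the citation if either
            # reviewer would include it at this stage.
            if not (reviewer_a(citation, stage) or reviewer_b(citation, stage)):
                advanced = False
                break
        if not advanced:
            continue
        a = reviewer_a(citation, "full_text")
        b = reviewer_b(citation, "full_text")
        if a and b:
            eligible.append(citation)
        elif a != b:
            needs_adjudication.append(citation)
    return eligible, needs_adjudication
```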

Results: Of the 68,000 citations identified by the literature search, 136 articles and 9 systematic reviews ultimately met our eligibility criteria. The overall quality of the literature was low, and consequently firm conclusions were not possible. Despite this, the literature overall supported the concept that CME was effective, at least to some degree, in achieving and maintaining the objectives studied, including knowledge (22 of 28 studies), attitudes (22 of 26), skills (12 of 15), practice behavior (61 of 105), and clinical practice outcomes (14 of 33). Common themes included that live media was more effective than print, multimedia was more effective than single-media interventions, and multiple exposures were more effective than a single exposure. The number of articles that addressed internal and/or external characteristics of CME activities was too small, and the studies too heterogeneous, to determine whether any of these are crucial for CME success. Evidence on the reliability and validity of the tools used to assess CME effectiveness was limited. Based on previous reviews, the evidence indicates that simulation methods in medical education are effective in the dissemination of psychomotor and procedural skills.
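For readers who want the reported outcome counts in one place, the following minimal Python sketch tabulates the figures quoted above and derives the share of studies finding CME effective in each category. The counts come directly from the abstract; the percentage column is a straightforward derivation, not a statistic reported by the review.

```python
# Tabulate the outcome counts reported in the Results section and compute the
# share of studies that found CME effective for each outcome category.

reported_counts = {
    "knowledge": (22, 28),
    "attitudes": (22, 26),
    "skills": (12, 15),
    "practice behavior": (61, 105),
    "clinical practice outcomes": (14, 33),
}

print(f"{'Outcome':<28}{'Effective':>10}{'Studied':>10}{'Share':>8}")
for outcome, (effective, total) in reported_counts.items():
    print(f"{outcome:<28}{effective:>10}{total:>10}{effective / total:>8.0%}")
```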

Conclusions: Despite the low quality of the evidence, CME appears to be effective in the acquisition and retention of knowledge, attitudes, skills, behaviors, and clinical outcomes. More research is needed to determine with any degree of certainty which types of media, techniques, and exposure volumes, as well as which internal and external audience characteristics, are associated with improvements in outcomes.
