The paradox of implementation: Tools without impact - reflections from the global evidence summit

Laura Mora Moreo
{"title":"实施的悖论:没有影响的工具——来自全球证据峰会的反思","authors":"Laura Mora Moreo","doi":"10.1002/gin2.70010","DOIUrl":null,"url":null,"abstract":"<p>The Global Evidence Summit (GES2024), held in Prague from September 9 to 13, 2024, united scholars, healthcare professionals, and policymakers to tackle global health challenges. Among the diverse topics presented, implementation science stood out, emphasising collaboration to bridge evidence and practice gaps while fostering discussions on improving health systems and outcomes. Notably, many talks and posters focused on implementing clinical guideline recommendations—a critical and growing area of interest. However, a paradox remains: <i>whilst the development of implementation tools is on the rise, their actual influence on clinical outcomes remains largely unquantified</i>.</p><p>The GES2024 presented numerous presentations on digital technologies, decision-support systems, and other advanced tools for disseminating and implementing clinical guidelines. For example, one exhibit showcased a mobile app designed to integrate palliative care guidelines into daily practice through a web-based platform for healthcare professionals. Despite their popularity and high levels of user engagement, evidence of these innovations' effectiveness in improving patient outcomes was limited. Many implementation tools gauge success through superficial metrics like downloads or user engagement, which primarily reflect conceptual use,<span><sup>1</sup></span> while these metrics indicate awareness or understanding, they rarely lead to the behavioural or process changes characteristic of instrumental use. For meaningful and sustained impact, such efforts must progress beyond these surrogate outcomes to measurable clinical outcomes that directly improve patient care or system efficiency, aligning with the goals of knowledge translation and implementation science. For example, the palliative care platform received over 100,000 visits in its first 6 months, but data on patient impact were lacking. Measuring app downloads is different from evaluating their effect on patient outcomes. While high engagement numbers suggest interest, they do not indicate whether the guidelines improved patient care or symptom management.</p><p>This raises a critical question: <b>Are implementers failing to recognise the importance of measuring the true impact of these tools, or are clinical guidelines recommendations inherently difficult to assess in real-world practice?</b></p><p>The literature emphasises that focusing only on engagement without assessing clinical outcomes undermines the main goal of guidelines—improving patient care. There is still a significant lack of information about the effective implementation of these recommendations. Despite existing frameworks, there is a shortage of evidence about real-world application and effectiveness. Research shows a persistent gap between creating guidelines and putting them into practice, often worsened by inconsistent adaptation across different contexts. This underscores the need for strong research methods to determine whether guidelines genuinely improve clinical practice and patient outcomes rather than relying solely on their potential impact.<span><sup>2-4</sup></span></p><p>Successful implementation of clinical tools requires more than availability; it demands integration into routine practice, alongside continuous evaluation. 
The lack of data on how well these tools are embedded in healthcare demonstrates the need for comprehensive assessments of implementation outcomes. Guideline recommendations are complex to implement, and the strategies should integrate several approaches. Technological tools can be useful, and their effectiveness has been proven in the past,<span><sup>5</sup></span> but it's important to consider that they are only a small piece of the puzzle. The hindrances behind a good implementation process are varied and depend on the specific context targeted for implementation.<span><sup>6, 7</sup></span></p><p>The discussions at the GES offered a platform to rethink how the implementation of recommendations should be evaluated. This editorial seeks to raise awareness that implementing recommendations extends beyond technical assistance systems. It emphasises the importance of considering the broader context, including local adaptation, sustainability, and real-world outcomes, to ensure that guidelines effectively improve practice and patient care. The implementation process must be viewed as a dynamic, multifaceted endeavour rather than a purely technical task.</p><p>I, Laura Alejandra Mora Moreo, confirm that I am the sole author of this manuscript. I made substantial contributions to the conception and interpretation of the work, drafted the manuscript, and revised it for important intellectual content.</p>","PeriodicalId":100266,"journal":{"name":"Clinical and Public Health Guidelines","volume":"2 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-12-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/gin2.70010","citationCount":"0","resultStr":"{\"title\":\"The paradox of implementation: Tools without impact - reflections from the global evidence summit\",\"authors\":\"Laura Mora Moreo\",\"doi\":\"10.1002/gin2.70010\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The Global Evidence Summit (GES2024), held in Prague from September 9 to 13, 2024, united scholars, healthcare professionals, and policymakers to tackle global health challenges. Among the diverse topics presented, implementation science stood out, emphasising collaboration to bridge evidence and practice gaps while fostering discussions on improving health systems and outcomes. Notably, many talks and posters focused on implementing clinical guideline recommendations—a critical and growing area of interest. However, a paradox remains: <i>whilst the development of implementation tools is on the rise, their actual influence on clinical outcomes remains largely unquantified</i>.</p><p>The GES2024 presented numerous presentations on digital technologies, decision-support systems, and other advanced tools for disseminating and implementing clinical guidelines. For example, one exhibit showcased a mobile app designed to integrate palliative care guidelines into daily practice through a web-based platform for healthcare professionals. Despite their popularity and high levels of user engagement, evidence of these innovations' effectiveness in improving patient outcomes was limited. Many implementation tools gauge success through superficial metrics like downloads or user engagement, which primarily reflect conceptual use,<span><sup>1</sup></span> while these metrics indicate awareness or understanding, they rarely lead to the behavioural or process changes characteristic of instrumental use. 
For meaningful and sustained impact, such efforts must progress beyond these surrogate outcomes to measurable clinical outcomes that directly improve patient care or system efficiency, aligning with the goals of knowledge translation and implementation science. For example, the palliative care platform received over 100,000 visits in its first 6 months, but data on patient impact were lacking. Measuring app downloads is different from evaluating their effect on patient outcomes. While high engagement numbers suggest interest, they do not indicate whether the guidelines improved patient care or symptom management.</p><p>This raises a critical question: <b>Are implementers failing to recognise the importance of measuring the true impact of these tools, or are clinical guidelines recommendations inherently difficult to assess in real-world practice?</b></p><p>The literature emphasises that focusing only on engagement without assessing clinical outcomes undermines the main goal of guidelines—improving patient care. There is still a significant lack of information about the effective implementation of these recommendations. Despite existing frameworks, there is a shortage of evidence about real-world application and effectiveness. Research shows a persistent gap between creating guidelines and putting them into practice, often worsened by inconsistent adaptation across different contexts. This underscores the need for strong research methods to determine whether guidelines genuinely improve clinical practice and patient outcomes rather than relying solely on their potential impact.<span><sup>2-4</sup></span></p><p>Successful implementation of clinical tools requires more than availability; it demands integration into routine practice, alongside continuous evaluation. The lack of data on how well these tools are embedded in healthcare demonstrates the need for comprehensive assessments of implementation outcomes. Guideline recommendations are complex to implement, and the strategies should integrate several approaches. Technological tools can be useful, and their effectiveness has been proven in the past,<span><sup>5</sup></span> but it's important to consider that they are only a small piece of the puzzle. The hindrances behind a good implementation process are varied and depend on the specific context targeted for implementation.<span><sup>6, 7</sup></span></p><p>The discussions at the GES offered a platform to rethink how the implementation of recommendations should be evaluated. This editorial seeks to raise awareness that implementing recommendations extends beyond technical assistance systems. It emphasises the importance of considering the broader context, including local adaptation, sustainability, and real-world outcomes, to ensure that guidelines effectively improve practice and patient care. The implementation process must be viewed as a dynamic, multifaceted endeavour rather than a purely technical task.</p><p>I, Laura Alejandra Mora Moreo, confirm that I am the sole author of this manuscript. 
I made substantial contributions to the conception and interpretation of the work, drafted the manuscript, and revised it for important intellectual content.</p>\",\"PeriodicalId\":100266,\"journal\":{\"name\":\"Clinical and Public Health Guidelines\",\"volume\":\"2 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-12-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1002/gin2.70010\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Clinical and Public Health Guidelines\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/gin2.70010\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Clinical and Public Health Guidelines","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/gin2.70010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

摘要

全球证据峰会(GES2024)于2024年9月9日至13日在布拉格举行,汇集了学者、卫生保健专业人员和政策制定者,以应对全球卫生挑战。在提出的各种主题中,实施科学脱颖而出,强调合作弥合证据和实践差距,同时促进关于改善卫生系统和成果的讨论。值得注意的是,许多演讲和海报都集中在临床指南建议的实施上——这是一个关键且日益增长的领域。然而,一个矛盾仍然存在:虽然实施工具的发展正在上升,但它们对临床结果的实际影响在很大程度上仍然无法量化。GES2024在数字技术、决策支持系统和其他用于传播和实施临床指南的先进工具方面做了大量报告。例如,一个展览展示了一个移动应用程序,该应用程序旨在通过医疗保健专业人员的网络平台将姑息治疗指南整合到日常实践中。尽管它们很受欢迎,用户参与度也很高,但这些创新在改善患者预后方面的有效性证据有限。许多执行工具通过诸如下载量或用户粘性等肤浅的指标来衡量成功与否,这些指标主要反映概念使用,虽然这些指标表明意识或理解,但它们很少导致工具性使用特征的行为或过程变化。为了产生有意义和持续的影响,这些努力必须超越这些替代结果,取得可衡量的临床结果,直接改善患者护理或系统效率,与知识转化和实施科学的目标保持一致。例如,姑息治疗平台在头6个月接待了超过10万人次的访问,但缺乏关于患者影响的数据。衡量应用下载量不同于评估应用对患者治疗效果的影响。虽然高参与度表明人们对此感兴趣,但这并不能说明指南是否改善了患者护理或症状管理。这就提出了一个关键的问题:是实施者没有认识到衡量这些工具的真正影响的重要性,还是临床指南建议本身就难以在现实世界的实践中进行评估?文献强调,只关注参与而不评估临床结果破坏了指导方针的主要目标——改善患者护理。关于这些建议的有效执行情况的资料仍然严重缺乏。尽管有现有的框架,但缺乏关于实际应用和有效性的证据。研究表明,在制定指导方针和将其付诸实践之间存在着持续的差距,这种差距往往因在不同背景下不一致的适应而恶化。这强调了需要强有力的研究方法来确定指南是否真正改善临床实践和患者结果,而不是仅仅依赖其潜在影响。2-4临床工具的成功实施需要的不仅仅是可用性;它需要整合到日常实践中,以及持续的评估。缺乏关于这些工具在医疗保健中嵌入情况的数据表明,需要对实施结果进行全面评估。指南建议的实施是复杂的,战略应该整合几种方法。技术工具是有用的,它们的有效性在过去得到了证明,但重要的是要考虑到它们只是拼图的一小部分。良好实施过程背后的障碍是多种多样的,取决于具体的实施环境。6,7 . GES的讨论提供了一个平台,可以重新考虑如何评价各项建议的执行情况。这篇社论旨在提高人们的认识,即执行建议的范围超出了技术援助系统。它强调了考虑更广泛背景的重要性,包括当地适应性、可持续性和现实结果,以确保指南有效改善实践和患者护理。必须将执行进程视为一项动态的、多方面的努力,而不是一项纯粹的技术任务。我,Laura Alejandra Mora Moreo,确认我是这份手稿的唯一作者。我对作品的构思和解释做出了实质性的贡献,起草了手稿,并对重要的知识内容进行了修改。
本文章由计算机程序翻译,如有差异,请以英文原文为准。
The paradox of implementation: Tools without impact - reflections from the global evidence summit

The Global Evidence Summit (GES2024), held in Prague from September 9 to 13, 2024, united scholars, healthcare professionals, and policymakers to tackle global health challenges. Among the diverse topics presented, implementation science stood out, emphasising collaboration to bridge evidence and practice gaps while fostering discussions on improving health systems and outcomes. Notably, many talks and posters focused on implementing clinical guideline recommendations—a critical and growing area of interest. However, a paradox persists: whilst the development of implementation tools is on the rise, their actual influence on clinical outcomes remains largely unquantified.

GES2024 featured numerous presentations on digital technologies, decision-support systems, and other advanced tools for disseminating and implementing clinical guidelines. For example, one exhibit showcased a mobile app designed to integrate palliative care guidelines into daily practice through a web-based platform for healthcare professionals. Despite the popularity and high levels of user engagement these innovations attract, evidence of their effectiveness in improving patient outcomes was limited. Many implementation tools gauge success through superficial metrics such as downloads or user engagement, which primarily reflect conceptual use.1 While these metrics indicate awareness or understanding, they rarely lead to the behavioural or process changes characteristic of instrumental use. For meaningful and sustained impact, such efforts must progress beyond these surrogate outcomes to measurable clinical outcomes that directly improve patient care or system efficiency, aligning with the goals of knowledge translation and implementation science. The palliative care platform, for instance, received over 100,000 visits in its first 6 months, but data on patient impact were lacking. Measuring app downloads is different from evaluating their effect on patient outcomes. While high engagement numbers suggest interest, they do not indicate whether the guidelines improved patient care or symptom management.

This raises a critical question: Are implementers failing to recognise the importance of measuring the true impact of these tools, or are clinical guideline recommendations inherently difficult to assess in real-world practice?

The literature emphasises that focusing only on engagement without assessing clinical outcomes undermines the main goal of guidelines—improving patient care. There is still a significant lack of information about the effective implementation of these recommendations. Despite existing frameworks, there is a shortage of evidence about real-world application and effectiveness. Research shows a persistent gap between creating guidelines and putting them into practice, often worsened by inconsistent adaptation across different contexts. This underscores the need for rigorous research methods to determine whether guidelines genuinely improve clinical practice and patient outcomes, rather than judging them solely on their potential impact.2-4

Successful implementation of clinical tools requires more than availability; it demands integration into routine practice, alongside continuous evaluation. The lack of data on how well these tools are embedded in healthcare demonstrates the need for comprehensive assessments of implementation outcomes. Guideline recommendations are complex to implement, and implementation strategies should integrate several approaches. Technological tools can be useful, and their effectiveness has been demonstrated in the past,5 but it is important to recognise that they are only a small piece of the puzzle. The barriers to a good implementation process are varied and depend on the specific context targeted for implementation.6, 7

The discussions at GES2024 offered a platform to rethink how the implementation of recommendations should be evaluated. This editorial seeks to raise awareness that implementing recommendations extends beyond technical assistance systems. It emphasises the importance of considering the broader context, including local adaptation, sustainability, and real-world outcomes, to ensure that guidelines effectively improve practice and patient care. The implementation process must be viewed as a dynamic, multifaceted endeavour rather than a purely technical task.

I, Laura Alejandra Mora Moreo, confirm that I am the sole author of this manuscript. I made substantial contributions to the conception and interpretation of the work, drafted the manuscript, and revised it for important intellectual content.
