{"title":"重要的不是你做了什么,而是你做事的方式","authors":"J. Gray","doi":"10.1258/JICP.2008.008006","DOIUrl":null,"url":null,"abstract":"As clearly demonstrated in Dr Smith’s Letter to the Editor published in this issue of Journal of Integrated Care Pathways, just having a care pathway in existence, however well-designed, is not enough to ensure improved care to the patient. Over the past 20 years, I have seen innumerable examples of care pathways that appear well-designed from the perspectives of both content and layout, having little effect either way on the care delivered. On the other hand, I have also seen as many care pathways that by anyone’s standards appear poor, but that have been embraced by the local team and that have had a significant, measurable effect on the quality and efficiency of processes and the outcomes of care. This poses the question ‘does a care pathway have any real impact on improving care, and if so, what determines its effectiveness?’ Part of the answer lies in common sense. The most perfectly designed care pathway, if little understood and poorly used, can hardly be expected to make any difference to anything. On the other hand, a care pathway thoughtfully designed with the involvement of those whowill use it, that seeks to ease, coordinate and streamline the provision of the best possible care, and that provides relevant, regular and well-targeted feedback to inform and interest those same people, has far more chance of having an impact on process and outcomes. I have recently been contacted by a Publishing Director who is interested in the better understanding of what is ‘good practice’ when it comes to reviewing pathways. To date, a surprisingly little amount of effort or research has gone into this area. Some examples of pathway audit tools that consider issues such as the content and layout of care pathway tools and the mechanisms of organizing care include: the Clinical Path Assessment developed in the late-1990s by the Centre for Case Management (USA); the ‘badge of quality’; an integrated care pathways appraisal tool developed in 2002 by De Luc et al.; the Integrated Care Pathway Appraisal Tool (ICPAT) developed in 1999 by Wittle et al.; with the support of the Partnership for Developing Quality, West Midlands Regional Levy Board; the ICP Key Elements Checklist developed in 2004 by Croucher as part of a Masters thesis; and the Care Process Self Evaluation Tool (CPSET) developed between 2004 and 2007 by Vanhaecht as part of a thesis to obtain the degree of Doctor in Social Health Sciences. Venture Training & Consulting has developed and used two Care Pathway Quality Scorecards as an exercise over the past 10 years to help teams to ‘know a good care pathway when they see one’ and to decide what they want out of the care pathway that they plan to develop locally. However, none of these tools fully address the relationship between key characteristics of the care pathway and successful implementation. It is certainly possible to teach and to recognize quality content and good design of a care pathway. This supports a growing view that nationally developed and accredited, high-level care pathway maps/ algorithms and supporting care pathway documents, decision scorecards, guides, etc., that can be adapted and built upon for local use, are a valuable starting point. 
These high-level care pathways are in the most part uncontentious and can provide local teams with the information and confidence that they are implementing the nationally agreed key elements of evidence-based best practice. Guidelines, protocols and initiatives such as the UK Standards for Better Health and Care Bundles can be incorporated to inform evidence-based best practice. Variation can Jenny Gray MCSP SRP Grad Dip Phys, Managing Director, Venture Training & Consulting, Manor Farm Barns, Selsey Road, Donnington, Chichester, West Sussex PO20 7PL, UK.","PeriodicalId":332790,"journal":{"name":"Journal of Integrated Care Pathways","volume":"71 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"It's not what you do, it's the way that you do it\",\"authors\":\"J. Gray\",\"doi\":\"10.1258/JICP.2008.008006\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As clearly demonstrated in Dr Smith’s Letter to the Editor published in this issue of Journal of Integrated Care Pathways, just having a care pathway in existence, however well-designed, is not enough to ensure improved care to the patient. Over the past 20 years, I have seen innumerable examples of care pathways that appear well-designed from the perspectives of both content and layout, having little effect either way on the care delivered. On the other hand, I have also seen as many care pathways that by anyone’s standards appear poor, but that have been embraced by the local team and that have had a significant, measurable effect on the quality and efficiency of processes and the outcomes of care. This poses the question ‘does a care pathway have any real impact on improving care, and if so, what determines its effectiveness?’ Part of the answer lies in common sense. The most perfectly designed care pathway, if little understood and poorly used, can hardly be expected to make any difference to anything. On the other hand, a care pathway thoughtfully designed with the involvement of those whowill use it, that seeks to ease, coordinate and streamline the provision of the best possible care, and that provides relevant, regular and well-targeted feedback to inform and interest those same people, has far more chance of having an impact on process and outcomes. I have recently been contacted by a Publishing Director who is interested in the better understanding of what is ‘good practice’ when it comes to reviewing pathways. To date, a surprisingly little amount of effort or research has gone into this area. Some examples of pathway audit tools that consider issues such as the content and layout of care pathway tools and the mechanisms of organizing care include: the Clinical Path Assessment developed in the late-1990s by the Centre for Case Management (USA); the ‘badge of quality’; an integrated care pathways appraisal tool developed in 2002 by De Luc et al.; the Integrated Care Pathway Appraisal Tool (ICPAT) developed in 1999 by Wittle et al.; with the support of the Partnership for Developing Quality, West Midlands Regional Levy Board; the ICP Key Elements Checklist developed in 2004 by Croucher as part of a Masters thesis; and the Care Process Self Evaluation Tool (CPSET) developed between 2004 and 2007 by Vanhaecht as part of a thesis to obtain the degree of Doctor in Social Health Sciences. 
Venture Training & Consulting has developed and used two Care Pathway Quality Scorecards as an exercise over the past 10 years to help teams to ‘know a good care pathway when they see one’ and to decide what they want out of the care pathway that they plan to develop locally. However, none of these tools fully address the relationship between key characteristics of the care pathway and successful implementation. It is certainly possible to teach and to recognize quality content and good design of a care pathway. This supports a growing view that nationally developed and accredited, high-level care pathway maps/ algorithms and supporting care pathway documents, decision scorecards, guides, etc., that can be adapted and built upon for local use, are a valuable starting point. These high-level care pathways are in the most part uncontentious and can provide local teams with the information and confidence that they are implementing the nationally agreed key elements of evidence-based best practice. Guidelines, protocols and initiatives such as the UK Standards for Better Health and Care Bundles can be incorporated to inform evidence-based best practice. Variation can Jenny Gray MCSP SRP Grad Dip Phys, Managing Director, Venture Training & Consulting, Manor Farm Barns, Selsey Road, Donnington, Chichester, West Sussex PO20 7PL, UK.\",\"PeriodicalId\":332790,\"journal\":{\"name\":\"Journal of Integrated Care Pathways\",\"volume\":\"71 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Integrated Care Pathways\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1258/JICP.2008.008006\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Integrated Care Pathways","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1258/JICP.2008.008006","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5
摘要
As clearly demonstrated in Dr Smith's Letter to the Editor published in this issue of Journal of Integrated Care Pathways, just having a care pathway in existence, however well-designed, is not enough to ensure improved care for the patient. Over the past 20 years, I have seen innumerable examples of care pathways that appear well-designed from the perspectives of both content and layout, yet have had little effect on the care delivered. On the other hand, I have also seen as many care pathways that by anyone's standards appear poor, but that have been embraced by the local team and have had a significant, measurable effect on the quality and efficiency of processes and the outcomes of care. This poses the question 'does a care pathway have any real impact on improving care, and if so, what determines its effectiveness?'

Part of the answer lies in common sense. The most perfectly designed care pathway, if little understood and poorly used, can hardly be expected to make any difference to anything. On the other hand, a care pathway thoughtfully designed with the involvement of those who will use it, that seeks to ease, coordinate and streamline the provision of the best possible care, and that provides relevant, regular and well-targeted feedback to inform and interest those same people, has a far greater chance of having an impact on process and outcomes.

I have recently been contacted by a Publishing Director who is interested in gaining a better understanding of what constitutes 'good practice' when it comes to reviewing pathways. To date, surprisingly little effort or research has gone into this area. Some examples of pathway audit tools that consider issues such as the content and layout of care pathway tools and the mechanisms of organizing care include: the Clinical Path Assessment, developed in the late 1990s by the Centre for Case Management (USA); the 'badge of quality', an integrated care pathways appraisal tool developed in 2002 by De Luc et al.; the Integrated Care Pathway Appraisal Tool (ICPAT), developed in 1999 by Wittle et al. with the support of the Partnership for Developing Quality, West Midlands Regional Levy Board; the ICP Key Elements Checklist, developed in 2004 by Croucher as part of a Masters thesis; and the Care Process Self Evaluation Tool (CPSET), developed between 2004 and 2007 by Vanhaecht as part of a thesis for the degree of Doctor in Social Health Sciences.

Venture Training & Consulting has developed and used two Care Pathway Quality Scorecards as an exercise over the past 10 years to help teams 'know a good care pathway when they see one' and to decide what they want out of the care pathway that they plan to develop locally. However, none of these tools fully addresses the relationship between the key characteristics of a care pathway and its successful implementation. It is certainly possible to teach and to recognize quality content and good design in a care pathway. This supports a growing view that nationally developed and accredited, high-level care pathway maps/algorithms and supporting care pathway documents, decision scorecards, guides, etc., that can be adapted and built upon for local use, are a valuable starting point. These high-level care pathways are for the most part uncontentious and can provide local teams with the information and confidence that they are implementing the nationally agreed key elements of evidence-based best practice. Guidelines, protocols and initiatives such as the UK Standards for Better Health and Care Bundles can be incorporated to inform evidence-based best practice. Variation can

Jenny Gray MCSP SRP Grad Dip Phys, Managing Director, Venture Training & Consulting, Manor Farm Barns, Selsey Road, Donnington, Chichester, West Sussex PO20 7PL, UK.