Abstract

As clearly demonstrated in Dr Smith’s Letter to the Editor published in this issue of Journal of Integrated Care Pathways, the mere existence of a care pathway, however well designed, is not enough to ensure improved care for the patient. Over the past 20 years, I have seen innumerable examples of care pathways that appear well designed in both content and layout, yet have little effect either way on the care delivered. Conversely, I have seen as many care pathways that by anyone’s standards appear poor, but that have been embraced by the local team and have had a significant, measurable effect on the quality and efficiency of processes and on the outcomes of care. This poses the question: does a care pathway have any real impact on improving care, and if so, what determines its effectiveness? Part of the answer lies in common sense. The most perfectly designed care pathway, if little understood and poorly used, can hardly be expected to make any difference to anything. On the other hand, a care pathway thoughtfully designed with the involvement of those who will use it, that seeks to ease, coordinate and streamline the provision of the best possible care, and that provides relevant, regular and well-targeted feedback to inform and interest those same people, has a far better chance of affecting process and outcomes.

I have recently been contacted by a Publishing Director interested in developing a better understanding of what constitutes ‘good practice’ when it comes to reviewing pathways. To date, surprisingly little effort or research has gone into this area. Examples of pathway audit tools that consider issues such as the content and layout of care pathway documents and the mechanisms of organizing care include: the Clinical Path Assessment, developed in the late 1990s by the Center for Case Management (USA); the ‘badge of quality’, an integrated care pathways appraisal tool developed in 2002 by De Luc et al.; the Integrated Care Pathway Appraisal Tool (ICPAT), developed in 1999 by Whittle et al. with the support of the Partnership for Developing Quality, West Midlands Regional Levy Board; the ICP Key Elements Checklist, developed in 2004 by Croucher as part of a Masters thesis; and the Care Process Self Evaluation Tool (CPSET), developed between 2004 and 2007 by Vanhaecht as part of a thesis for the degree of Doctor in Social Health Sciences. Over the past 10 years, Venture Training & Consulting has developed and used two Care Pathway Quality Scorecards as an exercise to help teams to ‘know a good care pathway when they see one’ and to decide what they want from the care pathway they plan to develop locally. However, none of these tools fully addresses the relationship between the key characteristics of a care pathway and its successful implementation.

It is certainly possible to teach, and to recognize, quality content and good design in a care pathway. This supports a growing view that nationally developed and accredited high-level care pathway maps/algorithms, together with supporting care pathway documents, decision scorecards, guides and so on that can be adapted and built upon for local use, are a valuable starting point. These high-level care pathways are for the most part uncontentious and can give local teams the information and confidence that they are implementing the nationally agreed key elements of evidence-based best practice.
Guidelines, protocols and initiatives such as the UK Standards for Better Health and Care Bundles can be incorporated to inform evidence-based best practice. Variation can …

Jenny Gray MCSP SRP Grad Dip Phys, Managing Director, Venture Training & Consulting, Manor Farm Barns, Selsey Road, Donnington, Chichester, West Sussex PO20 7PL, UK.
