Abstract

Background

In several countries, attempts are made to improve health promotion by centrally rating the effectiveness of health promotion interventions. The Dutch Effectiveness Rating System (ERS) for health promotion interventions is an improvement-oriented approach in which multi-disciplinary expert committees rate available health promotion interventions as ‘theoretically sound’, ‘probably effective’ or ‘proven effective’. The aim of this study is to explore the functioning of the ERS and the perspectives of researchers, policy-makers and practitioners regarding its contribution to improvement.

Methods

We interviewed 53 selected key informants from research, policy and practice in the Netherlands and observed the assessment of 12 interventions.

Results

Between 2008 and 2012, a total of 94 interventions were submitted to the ERS, of which 23 were rejected, 58 were rated as ‘theoretically sound’, 10 were rated as ‘probably effective’ and 3 were rated as ‘proven effective’. According to participants, the ERS was intended to facilitate both the improvement of available interventions and the improvement of health promotion in practice. While participants expected that describing and rating interventions promoted learning and enhanced the transferability of interventions, they were concerned that the ERS approach was not suitable for guiding intervention development and improving health promotion in practice. The expert committees that assessed the interventions struggled with a lack of norms for the relevance of effects and with questions about how effects should be studied and rated. Health promotion practitioners were concerned that the ERS neglected the local adaptation of interventions and did not encourage the improvement of aspects like applicability and costs. Policy-makers and practitioners were worried that the lack of proven effectiveness legitimised cutbacks rather than learning and advancing health promotion.

Conclusion

While measuring and centrally rating the effectiveness of interventions can be beneficial, the evidence-based-inspired ERS approach is too limited to guide both intervention development and the improvement of health promotion in practice. To better contribute to improving health promotion, a more reflexive and responsive guidance approach is required, namely one which stimulates the improvement of different intervention aspects, provides targeted recommendations to practitioners and gives feedback to those who develop and rate interventions.

Highlights

  • In several countries, attempts are made to improve health promotion by centrally rating the effectiveness of health promotion interventions

  • Health promotion practitioners were concerned that the Effectiveness Rating System (ERS) neglected the local adaptation of interventions and did not encourage the improvement of aspects like applicability and costs, which they deemed important

  • While participants expected that the describing and rating of interventions promoted learning and enhanced the transferability of interventions, they were concerned that the ERS approach was not suitable for guiding intervention development and improving health promotion in practice


Introduction

In several countries, attempts are made to improve health promotion by centrally rating the effectiveness of health promotion interventions. Central rating of the effectiveness of interventions by experts can help health workers, policy-makers and others benefit from intervention development, research and appraisal work that has been carried out elsewhere, and can enhance the efficiency, effectiveness and legitimacy of health promotion [5, 6]. It can create an arena for the articulation of standards and the sharing and integration of knowledge, and may thereby facilitate learning in the health promotion system [9].

