Abstract

As global health programs have become increasingly complex, corresponding evaluations must be designed to assess the full complexity of these programs. Gavi and the Global Fund have commissioned 2 such evaluations to assess the full spectrum of their investments using a prospective mixed-methods approach. We aim to describe lessons learned from implementing these evaluations. This article presents a synthesis of lessons learned based on the Gavi and Global Fund prospective mixed-methods evaluations, with each evaluation considered a case study. The lessons are based on the evaluation team's experience from over 7 years (2013-2020) implementing these evaluations. The Centers for Disease Control and Prevention Framework for Evaluation in Public Health was used to ground the identification of lessons learned. We identified 5 lessons learned that build on existing evaluation best practices and include a mix of practical and conceptual considerations. The lessons cover the importance of (1) including an inception phase to engage stakeholders and inform a relevant, useful evaluation design; (2) aligning on the degree to which the evaluation is embedded in the program implementation; (3) monitoring programmatic, organizational, or contextual changes and adapting the evaluation accordingly; (4) hiring evaluators with mixed-methods expertise and using tools and approaches that facilitate mixing methods; and (5) contextualizing recommendations and clearly communicating their underlying strength of evidence. Global health initiatives, particularly those leveraging complex interventions, should consider embedding evaluations to understand how and why the programs are working. These initiatives can learn from the lessons presented here to inform the design and implementation of such evaluations.

Highlights

  • As global health programs have become increasingly complex, corresponding evaluations must be designed to assess the full complexity of these programs

  • Insights came primarily from individuals who were involved in implementing the evaluations: the global evaluation partners (GEPs), PATH and the Institute for Health Metrics and Evaluation (IHME), which oversaw the evaluations and conducted cross-country synthesis, and the country evaluation partners (CEPs), which were primarily responsible for data collection, analysis, and reporting in their country

  • The CEPs included research organizations, academic institutions, and nonprofit organizations based in the focus countries for each evaluation (Full Country Evaluations (FCE): Bangladesh, Mozambique, Uganda, Zambia; Prospective Country Evaluation (PCE): Democratic Republic of the Congo, Guatemala, Senegal, Uganda)

Introduction

As global health programs have become increasingly complex, corresponding evaluations must be designed to assess the full complexity of these programs. Evaluations may need to consider programmatic outcomes and other outputs and outcomes across the system to understand how to improve programs to achieve impact. The goal of these evaluations is to understand not only what happened as a result of the program but, crucially, why the change occurred. We draw from Ozawa and Pongpirul,[8] who define these approaches as evaluations that “intentionally integrate or combine quantitative and qualitative data to maximize the strengths of each, to answer questions that are inadequately answered by one approach.”

