Abstract

Intended for researchers and clinical leaders, this article suggests that embedded program evaluation is a good fit with the desired features of practice-oriented research. The systematic nature of evaluation that is built into the operational workflow of a practice setting may increase the diversity of methods available to explore processes and outcomes of interest. We propose a novel conceptual framework that uses a human-centered systems lens to foster such embedded evaluation in clinical routine. This approach emphasizes the evaluator-practitioner partnership to build confidence in the bi-directional exchange between practice-based evidence and evidence-based practice. The iterative cycles inherent to design thinking are aimed at developing better evaluation questions. The attention to structure and context inherent to systems thinking is intended to support meaningful perspectives in the naturally complex world of health care. Importantly, the combined human-centered systems lens can create greater awareness of the influence of individual and systemic biases that exist in any endeavor or institution that involves people. Recommended tools and strategies include systems mapping, program theory development, and visual facilitation using a logic model to represent the complexity of mental health treatment for communication, shared understanding, and connection to the broader evidence base. To illustrate elements of the proposed conceptual framework, two case examples are drawn from routine outcome monitoring (ROM) and progress feedback. We conclude with questions for future collaboration and research that may strengthen the partnership of evaluators and practitioners as a community of learners in service of local and system-level improvement.
