Abstract

Despite increasing pressure for policy and practice to adopt a more evidence-based approach, transferring evidence into use remains a stubborn challenge. This is largely due to a number of researcher-derived and user-derived barriers at play within institutions, organisations and systems that constrain active engagement with evidence. This paper contributes to current debates on evidence-use by suggesting that embedded evaluation approaches might overcome such barriers through the creation of social capital that embedded evaluators can draw upon to: (i) build trust, confidence and understanding around evaluation and evidence on the part of local practitioners and policy makers; and (ii) develop a co-productive evidence infrastructure that draws together diverse stakeholders and encourages user engagement with varied forms of evaluation data (both numerical data and rich narrative accounts), capturing the richness of the unfolding story of complex educational initiatives. To illustrate this, the paper presents six reflective vignettes that consider the barriers encountered, and the responses sought, within a two-year exploratory case study that aimed to actively engage local education-related stakeholders, leading a borough-wide place-based initiative, with evidence. Data reported within these vignettes were collected through a series of semi-structured interviews, supplemented by an embedded evaluator's field notes, and analysed thematically. Although more research is needed, the paper concludes that the embedded evaluation model may have the potential to respond to the diverse challenges associated with evidence-use by positioning the evaluator in a relationally elastic way that enables them to embed evidence-into-use pathways within policy and practice, both actively and iteratively.
Context and implications

Rationale for this study

To present an empirical example of an embedded evaluation approach that has sought to overcome a range of barriers to evidence-use.

Why the new findings matter

Responding to a lack of consensus around the most effective ways to implement evidence into educational practice, the findings suggest potentially significant solutions to established barriers to evidence-use.

Implications for

This paper has implications for programme/project leads of place-based education improvement strategies, both operationally and strategically, in demonstrating how embedded evaluation might enable more institutionally relevant and longitudinal evaluation agendas and designs to be co-produced with a range of education-related stakeholders, including young people.

This paper also has implications for university-based researchers and evaluators. In particular, it demonstrates the possibilities that embedded research and evaluation might offer for bridging between the university as a world-leading research hub and regional education systems and structures, in ways that place evidence at the heart of local policy and practice.