Abstract

Many evaluation approaches do not account for temporality and complexity. This thesis is a methodological examination of evaluation techniques based on a case study, which was designed with a dual purpose: first, to evaluate a sexuality education programme with a focus on intimate partner violence (IPV) prevention in Mexico City; and second, to provide empirical data for the thesis. The case study was an evaluation with a longitudinal quasi-experimental design. Data collection methods were semi-structured in-depth interviews, focus group discussions, self-administered questionnaires, and observations of the intervention. I used thematic analysis to examine intervention effects. The methodological exploration used a qualitative observational design based on the case study, exploring questions about the utility of qualitative longitudinal and complex adaptive systems approaches in evaluation and how qualitative and quantitative approaches to data collection compare. Evaluation data collected in Mexico served as raw data, and I wrote fieldnotes about the evaluation process. I used framework analysis, applied a complex systems approach, compared data collected through different methods, and identified barriers to high-quality data. In the evaluation, we found evidence that the intervention contributed to changes in beliefs, intentions, and behaviours related to gender, sexuality, and IPV. The methodological analysis showed that repeat interviews illuminated how the intervention influenced relationship trajectories and provided contextualised data about lived experiences. A complex adaptive systems approach helped us examine system-disruptive elements of the intervention. Challenges to data collection included earthquake-related delays, social complexities, the shifting nature of relationship experiences, and variability in motivation to participate in the study. 
A reflexive discussion of such barriers to high-quality data should inform the interpretation of research findings. I argue that evaluation methods should be designed to engage with unpredictability, interaction, temporality, and change, and should centre on building a contextualised understanding of pathways to impact. Evaluations should engage stakeholders and beneficiaries to ensure relevant research questions and to define what ‘meaningful’ evidence entails; this will facilitate utilisation of findings.
