Abstract

Background: Most funders require non-governmental organisations to evaluate the effectiveness of their programmes. However, in our experience, funders seldom fund evaluation endeavours, and organisational staff often lack evaluation skills.

Aim: In this outcome evaluation of Living through Learning's (LTL) class-based, English-medium Coronation Reading Adventure Room (CRAR) programme, we addressed two evaluation questions: whether Grade 1 learners who participated in the programme attained LTL's and the Department of Basic Education's (DBE) literacy standards at the end of the programme, and whether teacher attributes contributed to this improvement.

Setting: The evaluation was conducted in 18 no-fee schools in Cape Town. Participants comprised 1090 Grade 1 learners and 54 teachers.

Methods: We used Level 2 (programme design and theory) and part of Level 4 (outcome) of an evaluation hierarchy to assess the effectiveness of the programme.

Results: All but three schools attained the 60% performance standard set by LTL on all quarterly assessments, and all but two schools attained the DBE's 50% performance standard for English first language on all quarterly assessments. Of the teacher attributes examined, only teacher experience in literacy teaching significantly predicted learner performance in literacy in the first term of school.

Conclusion: We explain why our results should be interpreted with caution and make recommendations for future evaluations in terms of design, data collection and levels of evaluation.

Highlights

  • Various national and international assessments have shown the poor reading ability of South African learners

  • From the results reported in this study, it can be concluded that learners who received the Coronation Reading Adventure Room (CRAR) programme in addition to school-based literacy teaching were able to read and write according to Living through Learning (LTL) and DBE standards at the end of Grade 1

  • When data were disaggregated per school, we found that schools attained the LTL standard in all four terms and significantly exceeded the 50% DBE standard in all four terms


Introduction

In our experience, little, if any, funding is provided for the monitoring and evaluation of programmes. Combined with this lack of funds, we find that while programme staff members show great expertise in the content and implementation of their own programmes, they often lack the necessary monitoring and evaluation skills. Grigg et al. (2016) attempted to address the latter shortcoming by proposing core evaluation questions and research designs for evaluating reading programmes. These evaluation questions focused on problem definition, theories of change, programme implementation, programme outcomes and impact, and programme cost.

