Abstract

If we want to understand what works in studies of teacher education programs, we also need to understand what does not work. In this article, we discuss why a study evaluating the effects of an education program on implementation practices yielded unexpected results. Interviews with a sample of teacher graduates from the program revealed effects on implementation practices that were not evident in the original study: increased student participation, more teamwork, and a conception of error as an opportunity. The instrument and procedures of the original study did not allow these effects to be detected. The impact sheet for this article can be accessed at 10.6084/m9.figshare.22339567.
