Abstract

Everyone wants to know what works to help schools improve student achievement. And there are some programs that research tells us work in schools. But when these programs are scaled up and used by large numbers of schools in settings all over the country, the effects are often inconsistent and even disappointing (Elmore 1996). Why is this so often the case? The fact is that between the program design and the desired student outcomes lies the uncertain process of implementation. Too often, program implementation has been treated as an inscrutable period during which forces too numerous to name or analyze cause programs to mutate in unpredictable ways. It's common to hear that a program isn't being implemented with fidelity. Program designers, program implementers, and program evaluators often seem surprised by this lack of fidelity even though, over 30 years ago, we learned that complex programs go through a process of mutual adaptation in which both developers and implementers make adjustments to work more effectively (Berman and McLaughlin 1978).

Decades of research show that even the most clearly defined programs are unlikely to be implemented in ways that are in perfect consonance with their creators' vision. In fact, one of the most consistent findings from education research is variability in program implementation. Studies of various programs ranging from teacher professional development (Hill 2001; Spillane and Zeuli 1999) to comprehensive school reform (Berends, Bodilly, and Kirby 2002; Rowan, Camburn, and Barnes 2004; Supovitz and Taylor 2005) to specific instructional approaches (Penuel and Means 2004) find that improvement programs are often used inconsistently or in ways their designers had not expected. And although we don't want to argue that fidelity of implementation is the only thing to worry about in improving educational outcomes, some research suggests that it is directly related to producing predicted results (Bodilly 1998). As a result, program designers usually see the variability in implementation as a problem to overcome, and programs that are used in ways inconsistent with the designers' vision are frequently seen as failures.

To help both program designers and school-level implementers avoid this sense of failure, can we predict which parts of a program will stick and which will be changed? Or can we identify the points at which adaptation is likely to take place? Such information may help create programs that meet their overall goals, even if they don't look exactly the same in every school and classroom. A three-year study by the Consortium for Policy Research in Education (CPRE) at the University of Pennsylvania examined these questions.

The CPRE study was a longitudinal mixed-method examination of the implementation of five school-improvement programs in 15 high schools. Like many of our predecessors, we found substantial variation in how these programs were implemented (Supovitz and Weinbaum 2008). We theorized that implementation is a process of iterative refraction (Supovitz 2008a). Iterative refraction means that reforms are adjusted repeatedly as they are introduced into, and work their way through, school environments. Refraction captures the idea that external reforms are likely to change repeatedly as they filter through multiple layers of the education system, including the district, school, department, team, and classroom. The process is iterative because each level makes decisions about different components of a reform over time.
The theory of iterative refraction suggests that implementation may not be as unpredictable as we've been led to believe. Although adjustments are likely to occur at multiple places and repeatedly over time, the implementation process has junctures that can be identified and defined in ways that may increase the predictability of how programs are likely to be used. To achieve higher levels of fidelity, some program designers have sought to be as specific as possible about instructional approaches or organizational changes in schools. …
