Abstract
Knowledge of which interventions are more efficacious than others for given problems is central to evidence-based practice. Attempts to build this knowledge have been confined largely to reviews and meta-analyses of experiments comparing methods of psychotherapy. This literature has suggested that different methods tend to have equivalent results. The authors reviewed all experiments comparing 39 social work programs that were published between 1990 and 2001. Contrary to findings for psychotherapy experiments, a large majority of the social work comparisons showed differential effects. The role of common factors appeared to be diminished by departures of most of the social work programs from traditional psychotherapeutic models. Lack of statistical power appeared to be influential only in a few small-sample experiments. The findings support the use of comparative experimental designs to strengthen the empirical base of social work practice.

Key words: common factors; comparative experiment; efficacy; evidence-based practice; research review

Over the past three decades considerable progress has been made in social work and related fields in the identification of interventions whose efficacy has been demonstrated through controlled experimental research (Chambless & Ollendick, 2001; Gorey, 1996; Reid & Fortune, 2003; Wampold, 2001). A significant number of such methods are now available for a wide range of clinical situations. Less progress has been made in determining which interventions may be superior to others for given problems (Drisko, 2002; Wampold, 2001).

Determination of the relative efficacy of alternative methods, ideally made through randomized comparative designs, is important for several reasons. First, it provides information about which intervention may be preferable given a choice of possibly several that have been determined to be efficacious. Second, an intervention that has been found to be superior to a rival has additional proof of its efficacy, because the comparison controlled for nonspecific effects (for example, the attention of a supportive professional) provided by most treatments. Third, it can provide information about the influence of specific interventions through component designs (Ahn & Wampold, 2001). For example, if intervention x is compared with intervention x + y and the latter intervention leads to a greater margin of efficacy, there is evidence that the y component was responsible for that greater margin. As Chambless and Hollon (1998) put it, "Comparisons with other rival interventions ... sit atop a hierarchy of increasing competitive difficulty ... [and] treatments that are found to be superior ... are more highly valued" (p. 8).

The need to determine relative efficacy has been heightened by the proliferation of interventions found to surpass untreated controls. If several effective interventions are available for a given problem, evidence-based practitioners want to know which may be more effective.

DIFFICULTIES IN DETERMINING DIFFERENTIAL EFFECTIVENESS

Questions of differential effectiveness in the human services have been studied the most intensively with respect to psychotherapy. Comparative experiments dating back to the 1930s have compared different forms of psychotherapy, contrasting theoretical orientations (for example, behavioral versus psychodynamic), methods (for example, insight-oriented versus supportive), and structures (for example, individual versus group or family).
Research reviews and meta-analyses of these experiments have consistently failed to find much evidence of differential effects (Ahn & Wampold, 2001; Gloaguen, Cottraux, Cucherat, & Blackburn, 1998; Luborsky et al., 1999; Luborsky, Singer, & Luborsky, 1975; Shapiro & Shapiro, 1982; Smith, Glass, & Miller, 1980; Wampold et al., 1997). Some comparative experiments have produced differential effects, mainly in treatment of children and adolescents (Chambless & Ollendick, 2001), but they are the exception rather than the rule. …
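The statistical power issue noted in the abstract reflects a general feature of comparative designs: when two active treatments are contrasted, the expected between-treatment effect size is typically much smaller than a treatment-versus-no-treatment effect, so considerably larger samples are needed to detect a real difference. The following is a minimal sketch of the standard normal-approximation sample-size calculation, added here for illustration only; the effect sizes of d = 0.8 and d = 0.2 are conventional benchmarks, not figures reported in this review.

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided comparison of two
    group means with standardized effect size d (Cohen's d), using the
    usual normal approximation: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # about 1.96 for alpha = .05
    z_beta = NormalDist().inv_cdf(power)            # about 0.84 for power = .80
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Treatment versus no-treatment contrast (large effect assumed, d = 0.8):
print(n_per_group(0.8))   # roughly 25 participants per group
# Two active treatments compared (small effect assumed, d = 0.2):
print(n_per_group(0.2))   # roughly 393 participants per group
```

Under these illustrative assumptions, detecting a small difference between two active treatments requires on the order of 15 times as many participants per group as detecting a large treatment-versus-control difference, which is one reason small-sample comparative experiments may fail to show differential effects even when they exist.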