Abstract

Analysis of longitudinal anthropometric data from eight community-level nutrition programs, undertaken to determine their impact, led to the conclusion that conventional approaches to analysis do not eliminate indeterminacy because: (1) the data were inaccurate or inconsistent; (2) the measures or measurement methods produced misleading results; and, most importantly, (3) a lack of information about the local context of the interventions precluded the elimination of competing explanations of the observed outcomes. In that analysis, as in most similar analyses, the traditional experimental approach (applying a predesigned experiment with controls in a presumably constant environment) failed because the experimental context was unstable, unpredictable, and unique in each case. That instability, unpredictability, and uniqueness, in turn, called for a flexible intervention strategy capable of coping with the changing context. As an alternative approach to both analysis and intervention, reflection-in-action (R-I-A) is proposed. Six features of this model are: explicit specification of the framework underlying the intervention strategy; continuous monitoring of both data-gathering procedures and intervention strategies; periodic redesign of those procedures and strategies; collaboration among researchers, practitioners, and subjects throughout; use of on-the-spot experimentation to test particular hypotheses; and explicit enumeration of, and accounting for, potential factors confounding both the analysis and the intervention itself. By actively using the data for continuous monitoring, field practitioners working with analytic specialists are more likely to reduce or eliminate indeterminacy due to inaccurate data and/or contextual changes than are traditional researchers or evaluators who maintain distance between themselves and the intervention. Reflection-in-action is illustrated, in part, in the context of a recent evaluation conducted in Sri Lanka, where a return visit to the field with preliminary quantitative results led to modification of the interpretation of those results. Problems remain, however, in achieving full implementation of this approach: practitioners and scientists will have to change their attitudes and behavior to accommodate R-I-A, the role of quantitative analysis in program management and evaluation will have to be placed in proper perspective, and institutions supporting intervention activities will have to modify their approach to both funding and evaluation.
