Abstract

Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast to internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (pre-post designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
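
As a concrete illustration of one of these designs, the sketch below shows how an interrupted time series is commonly analyzed with segmented regression, estimating a level change and a slope change at the point the intervention is introduced. The data, variable names, and effect sizes are simulated assumptions for illustration only and are not drawn from the article.

```python
# Minimal segmented-regression sketch for an interrupted time series (ITS).
# All numbers are simulated; the model estimates a pre-intervention trend,
# a level change at the interruption, and a post-intervention slope change.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_months, interruption = 48, 24                       # intervention begins at month 24

time = np.arange(n_months)
post = (time >= interruption).astype(int)             # level-change indicator
time_since = np.where(post == 1, time - interruption, 0)  # slope-change term

# Simulated outcome: rising baseline trend, then a drop and a flatter trend
# after the intervention, plus noise.
y = 50 + 0.2 * time - 5 * post - 0.4 * time_since + rng.normal(0, 2, n_months)

X = sm.add_constant(np.column_stack([time, post, time_since]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [intercept, pre-trend, level change, slope change]
```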

Highlights

  • Public health practice involves implementing or adapting evidence-based interventions into new settings to improve health for individuals and populations. Such interventions typically include one or more of the 7 Ps [7]. Both public health and clinical research have increasingly sought to generate practice-based evidence on a wide range of interventions, which in turn has led to a greater focus on developing intervention research designs that can be applied in real-world settings [2, 7, 8, 9, 19, 24, 25].

  • Randomized controlled trials (RCTs), in which individuals are assigned to intervention or control arms, are considered the gold standard for assessing causality and, as such, are a first choice for most intervention research.

  • In addition to addressing the general concerns with RCTs, as discussed above, the advantages of the stepped-wedge design (SWD) include the logistical convenience of staggering the intervention’s rollout, which enables a smaller staff to be distributed across different implementation start times and allows for multilevel interventions to be integrated into practice or real-world settings (see the sketch after this list).
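
To make the staggered rollout concrete, here is a minimal, hypothetical sketch of a stepped-wedge schedule in which clusters cross from control to intervention one step at a time, analyzed with a mixed model that adjusts for calendar time and includes a cluster random intercept. The cluster counts, effect sizes, and analysis model are illustrative assumptions, not the article's specification.

```python
# Hypothetical stepped-wedge sketch: 6 clusters, a baseline period plus 6 steps;
# cluster c crosses over to the intervention at step c + 1 (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_clusters, n_steps, n_per_cell = 6, 7, 20

rows = []
for c in range(n_clusters):
    cluster_effect = rng.normal(0, 1)            # between-cluster variation
    crossover_step = c + 1                       # staggered start times
    for t in range(n_steps):
        treated = int(t >= crossover_step)
        for _ in range(n_per_cell):
            # outcome = secular time trend + intervention effect + cluster + noise
            y = 10 + 0.3 * t - 2.0 * treated + cluster_effect + rng.normal(0, 1)
            rows.append({"cluster": c, "step": t, "treated": treated, "y": y})

df = pd.DataFrame(rows)
# Mixed model: intervention effect, fixed effects for step, random cluster intercept.
fit = smf.mixedlm("y ~ treated + C(step)", df, groups=df["cluster"]).fit()
print(fit.params["treated"])                     # estimated intervention effect
```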

Summary

INTRODUCTION

Public health practice involves implementing or adapting evidence-based interventions into new settings to improve health for individuals and populations. Previous reviews in the Annual Review of Public Health have focused on the importance and use of QEDs and other methods to enhance causal inference when evaluating the impact of an intervention that has already been implemented [4, 7, 8, 16]. Design approaches in this case often include creating a post hoc comparison group for a natural experiment or identifying pre- and post-intervention data.

For example, in one evaluation of a bed net intervention, the investigators strengthened internal validity by collecting additional data that enabled them to (a) determine whether the reduction in malaria rates was most pronounced during the rainy season within the intervention communities, as this was a biologically plausible exposure period in which they could expect the largest effect size difference between intervention and control sites, and (b) examine use patterns for the bed nets.
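
The rainy-season check described above can be read as a test for an interaction between intervention status and the plausible exposure window. The sketch below illustrates that logic on simulated community-month data; the rates, season coding, and model are assumptions for illustration, not the study's actual data or analysis.

```python
# Hypothetical sketch: if bed nets drive the effect, the intervention-control
# gap in malaria case rates should be widest during the rainy season.
# Simulated data with an assumed rainy-season coding; illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for community in range(40):
    intervention = int(community < 20)           # half the communities intervene
    for month in range(12):
        rainy = int(month in (5, 6, 7, 8))       # assumed rainy-season months
        # cases per 1,000: higher when rainy; intervention effect concentrated
        # in the rainy season (the pattern the investigators looked for)
        mean_rate = 8 + 6 * rainy - 1 * intervention - 3 * rainy * intervention
        rows.append({
            "intervention": intervention,
            "rainy": rainy,
            "cases": max(rng.normal(mean_rate, 1.5), 0.0),
        })

df = pd.DataFrame(rows)
fit = smf.ols("cases ~ intervention * rainy", data=df).fit()
# A clearly negative interaction term is consistent with the expected pattern.
print(fit.params["intervention:rainy"])
```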

