Abstract

Background: Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. We sought to systematically assess coverage of the five EBP steps, review the outcome domains measured, and assess the properties of the instruments used in studies evaluating EBP educational interventions.

Methods: We conducted a systematic review of controlled studies (i.e. studies with a separate control group) which had investigated the effect of EBP educational interventions. We used a citation analysis technique, tracking the forward and backward citations of the index articles (i.e. the systematic reviews and primary studies included in an overview of the effect of EBP teaching) in Web of Science until May 2017. We extracted information on intervention content (grouped into the five EBP steps) and the outcome domains assessed. We also searched the literature for published reliability and validity data on the EBP instruments used.

Results: Of 1831 records identified, 302 full-text articles were screened and 85 were included. Of these, 46 (54%) studies were randomised trials, 51 (60%) included postgraduate-level participants, and 63 (75%) taught medical professionals. EBP Step 3 (critical appraisal) was the most frequently taught step (63 studies; 74%). Only 10 (12%) of the studies taught content addressing all five EBP steps. Of the 85 studies, 52 (61%) evaluated EBP skills, 39 (46%) knowledge, 35 (41%) attitudes, 19 (22%) behaviours, 15 (18%) self-efficacy, and 7 (8%) reactions to EBP teaching delivery. Of the 24 instruments used in the included studies, 6 were high quality (achieved ≥3 types of established validity evidence); these were used in 14 (29%) of the 52 studies that measured EBP skills, 14 (41%) of the 39 studies that measured EBP knowledge, and 8 (26%) of the 35 studies that measured EBP attitudes.

Conclusions: Most EBP educational interventions evaluated in controlled studies taught only some of the EBP steps (predominantly critical appraisal of evidence) and did not use high-quality instruments to measure outcomes. Educational packages and instruments that address all five EBP steps are needed to improve EBP teaching.

Highlights

  • Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak

  • We updated the search of a previously conducted systematic review of studies which evaluated the effect of EBP educational interventions [14] to find additional studies and extract additional information on content, outcome domains and EBP instruments

  • Similar to previous studies [7, 8], we found that the majority of evaluated EBP educational interventions focus on critically appraising evidence (EBP Step 3), often to the exclusion of other steps

Introduction

Despite the established interest in evidence-based practice (EBP) as a core competence for clinicians, evidence for how best to teach and evaluate EBP remains weak. The disproportionate focus on critical appraisal compared to the other four steps in the EBP process (question formulation, searching, applying, and self-assessment) is a major shortcoming of the current literature on teaching EBP [6,7,8]. A review of 20 EBP educational interventions for undergraduate medical students found that these interventions stressed certain EBP steps (asking clinical questions, acquiring evidence, and critically appraising evidence) but paid less attention to others (applying evidence, and assessing and reflecting) [9].

