Abstract

Background: There is currently an absence of valid and relevant instruments to evaluate how Evidence-based Practice (EBP) training improves, beyond knowledge, physicians’ skills. Our aim was to develop and test a tool to assess physicians’ EBP skills.

Methods: The tool we developed includes four parts to assess the skills necessary for applying the EBP steps: clinical question formulation; literature search; critical appraisal of the literature; and synthesis and decision making. We evaluated content and face validity, then tested the applicability of the tool and whether external observers could reliably use it to assess acquired skills. We estimated Kappa coefficients to measure concordance between raters.

Results: Twelve general practice (GP) residents and eleven GP teachers from the University of Bordeaux, France, were asked to: formulate four clinical questions (diagnostic, prognosis, treatment, and aetiology) from a proposed clinical vignette; find articles or guidelines to answer four relevant provided questions; analyse an original article answering one of these questions; synthesize knowledge from provided synopses; and decide about the four clinical questions. Concordance between two external raters was excellent for their assessment of participants’ appraisal of the significance of article results (K = 0.83), and good for assessment of the formulation of a diagnostic question (K = 0.76), PubMed/Medline (K = 0.71) or guideline (K = 0.67) searches, and appraisal of the methodological validity of articles (K = 0.68).

Conclusions: Our tool allows an in-depth analysis of EBP skills and could thus supplement existing instruments focused on knowledge or on a single EBP step. The actual usefulness of such tools for improving care and population health remains to be evaluated.
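The abstract reports inter-rater concordance as Kappa coefficients. As a minimal illustration (not the study’s actual analysis), the following Python sketch computes Cohen’s kappa with scikit-learn’s cohen_kappa_score on invented ratings from two hypothetical raters; the 0–2 rating scale and the scores themselves are assumptions for demonstration only.

    # A minimal sketch of computing Cohen's kappa between two raters.
    # The ratings below are hypothetical; the study's per-participant
    # scores are not reported in this abstract.
    from sklearn.metrics import cohen_kappa_score

    # Assumed ordinal scale: 0 = inadequate, 1 = partial, 2 = adequate,
    # applied by two external raters to the same twelve participants.
    rater_a = [2, 1, 2, 0, 2, 1, 1, 2, 0, 2, 1, 2]
    rater_b = [2, 1, 2, 0, 2, 1, 2, 2, 0, 2, 1, 1]

    # Kappa corrects raw agreement for agreement expected by chance:
    # 1.0 means perfect concordance, 0.0 means chance-level agreement.
    print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")

On data of this kind, values in the range reported here (K = 0.67 to 0.83) would conventionally be read as good to excellent agreement.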

Highlights

  • There is currently an absence of valid and relevant instruments to evaluate how Evidence-based Practice (EBP) training improves, beyond knowledge, physicians’ skills

  • EBP has been described as having five steps [15, 16]: 1) Formulate a clear clinical question about a patient’s problem; 2) Search the literature, with an appropriate strategy, for relevant articles [17]; 3) Critically appraise the evidence for its validity, clinical relevance and applicability; 4) Implement the useful findings back into clinical practice [18]; and 5) Evaluate the impact. This approach is useful in general practice (GP) to manage primary care situations, where it has been described as the sound simultaneous use of a critical research-based approach and a person-centred approach [19, 20]

  • We developed the first French-language tool to assess EBP skills of general practitioners

Introduction

There is currently an absence of valid and relevant instruments to evaluate how Evidence-based Practice (EBP) training improves, beyond knowledge, physicians’ skills. EBP has been described as having five steps [15, 16]: 1) Formulate a clear clinical question about a patient’s problem; 2) Search the literature, with an appropriate strategy, for relevant articles [17]; 3) Critically appraise the evidence for its validity, clinical relevance and applicability; 4) Implement the useful findings back into clinical practice [18]; and 5) Evaluate the impact. This approach is useful in general practice (GP) to manage primary care situations, where it has been described as the sound simultaneous use of a critical research-based approach and a person-centred approach [19, 20]. Trials evaluating EBP training have often been based on subjective judgements, due to the lack of reliable and valid tools to assess EBP skills [13, 14, 25–28].
