Abstract

BACKGROUND: Assessment of competence in endoscopic retrograde cholangiopancreatography (ERCP) is critical for supporting learning and documenting attainment of skill. Validity evidence supporting ERCP observational assessment tools has not been systematically evaluated.

METHODS: We conducted a systematic search of electronic databases, supplemented by hand-searching, from inception until August 2021 for studies evaluating observational assessment tools of ERCP performance. We used a unified validity framework to characterize validity evidence from five sources: content, response process, internal structure, relations to other variables, and consequences. Each domain was assigned a score of 0-3 (maximum total score 15). We assessed educational utility and methodological quality using the Accreditation Council for Graduate Medical Education framework and the Medical Education Research Quality Instrument, respectively.

RESULTS: From 2769 records, we included 17 studies evaluating 7 assessment tools. Five tools were studied for clinical ERCP, one for simulated ERCP, and one for both simulated and clinical ERCP. Validity evidence scores ranged from 2 to 12. The Bethesda ERCP Skills Assessment Tool (BESAT), the ERCP Direct Observation of Procedural Skills Tool (ERCP DOPS), and The Endoscopic Ultrasound (EUS) and ERCP Skills Assessment Tool (TEESAT) had the strongest validity evidence, with scores of 10, 12, and 11, respectively. Regarding educational utility, most tools were easy to use and interpret and required minimal additional resources. Overall methodological quality (maximum score 13.5) was strong, with scores ranging from 10 to 12.5.

CONCLUSIONS: The BESAT, ERCP DOPS, and TEESAT had stronger validity evidence than the other assessment tools. Integrating these tools into training may help drive learners' development and support competency-based decision making.
