Abstract

Assessment tools are essential for endoscopy training, as they are required to support feedback provision, optimize learner capabilities, and document competence. We aimed to evaluate the strength of validity evidence that supports the available colonoscopy direct observation assessment tools using the unified framework of validity. We systematically searched five databases for studies investigating colonoscopy direct observation assessment tools from inception until 8 April 2020. We extracted data outlining validity evidence (content, response process, internal structure, relations to other variables, and consequences) from the five sources and graded the degree of evidence, with a maximum score of 15. We assessed educational utility using an Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument (MERSQI). From 10 841 records, we identified 27 studies representing 13 assessment tools (10 adult, 2 pediatric, 1 both). All tools assessed technical skills, while 10 assessed cognitive skills and 10 assessed integrative skills. Validity evidence scores ranged from 1 to 15. The Assessment of Competency in Endoscopy (ACE) tool, the Direct Observation of Procedural Skills (DOPS) tool, and the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) had the strongest validity evidence, with scores of 13, 15, and 14, respectively. Most tools were easy to use and interpret, and required minimal resources. MERSQI scores ranged from 9.5 to 11.5 (maximum score 14.5). The ACE, DOPS, and GiECAT have strong validity evidence compared with other assessments. Future studies should identify barriers to widespread implementation and report on the use of these tools in credentialing examinations.
