Background and aims: Competent endoscopic ultrasound (EUS) performance requires a combination of technical, cognitive, and non-technical skills. Direct observation assessment tools can be employed to enhance learning and ascertain clinical competence; however, the validity evidence supporting their use needs systematic evaluation. We aimed to evaluate the validity evidence of competency assessment tools for EUS and examine their educational utility.

Methods: We systematically searched five databases and the grey literature for studies investigating EUS competency assessment tools from inception to May 2023. Data on validity evidence across five domains (content, response process, internal structure, relations to other variables, and consequences) were extracted and graded (maximum score 15). We evaluated educational utility using the Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument (MERSQI).

Results: From 2081 records, we identified 10 studies evaluating 5 EUS assessment tools. All 5 tools are formative assessments intended to guide learning; 4 have been employed in clinical settings. Validity evidence scores ranged from 3 to 12. The EUS and ERCP Skills Assessment Tool (TEESAT), the Global Assessment of Performance and Skills in EUS (GAPS-EUS), and the EUS Assessment Tool (EUSAT) had the strongest validity evidence, with scores of 12, 10, and 10, respectively. Overall educational utility was high, given the ease of tool use. MERSQI scores ranged from 9.5 to 12 (maximum score 13.5).

Conclusions: The TEESAT, GAPS-EUS, and EUSAT demonstrate strong validity evidence for formative assessment of EUS and are easily implemented in educational settings to monitor progress and support learning.