Abstract

The Accreditation Council for Graduate Medical Education (ACGME) requires programs to engage annually in program evaluation and improvement. We assessed the value of creating educational competency committees (ECCs) that use successful elements of 2 established processes: institutional special reviews and institutional oversight of annual program evaluations. The ECCs used a template to review programs' annual program evaluations. Results were aggregated into an institutional dashboard. We calculated the costs, sensitivity, specificity, and predictive value by comparing programs required to have a special review with those that had ACGME citations, requests for a progress report, or a data-prompted site visit. We assessed the value for professional development through a participant survey. Thirty-two ECCs involving more than 100 individuals reviewed 237 annual program evaluations over a 3-year period. The ECCs required less time than internal reviews. The ECCs rated 2 to 8 programs (2.4%-9.8%) as "noncompliant." One to 13 programs (1.2%-14.6%) had opportunities for improvement identified. Institutional improvements were recognized using the dashboard. Zero to 13 programs (0%-16%) were required to have special reviews. The sensitivity of the decision to have a special review was 83% to 100%; specificity was 89% to 93%; and negative predictive value was 99% to 100%. The total cost was $280 per program. Of the ECC members, 86% to 95% reported their participation enhanced their professional development, and 60% to 95% believed the ECC benefited their program. Educational competency committees facilitated the identification of institution-wide needs, highlighted innovation and best practices, and enhanced professional development. The cost, sensitivity, specificity, and predictive value indicated good value.
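The sensitivity, specificity, and negative predictive value reported above follow the standard 2x2 screening-test definitions, treating the special-review decision as the "test" and ACGME citations, progress-report requests, or data-prompted site visits as the reference outcome. A minimal Python sketch of those formulas, using entirely hypothetical counts (not the study's data):

```python
def review_metrics(tp, fp, fn, tn):
    """Compute sensitivity, specificity, and negative predictive value
    from a 2x2 table: special-review decision (test) vs. adverse
    accreditation outcome (reference)."""
    sensitivity = tp / (tp + fn)  # flagged programs among those with an outcome
    specificity = tn / (tn + fp)  # unflagged programs among those without one
    npv = tn / (tn + fn)          # outcome-free programs among those not flagged
    return sensitivity, specificity, npv

# Hypothetical counts for illustration only:
# 5 true positives, 7 false positives, 1 false negative, 70 true negatives
sens, spec, npv = review_metrics(tp=5, fp=7, fn=1, tn=70)
```

With these made-up counts, sensitivity is 5/6 (about 83%), specificity 70/77 (about 91%), and negative predictive value 70/71 (about 99%), which is the kind of arithmetic behind the ranges the abstract reports.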
