Abstract

Introduction: In 2018, Canadian postgraduate specialist Emergency Medicine (EM) programs began implementing a competency-based medical education (CBME) assessment system. To support improvement of this assessment program, we sought to evaluate its short-term educational outcomes nationally and within individual programs.

Methods: Program-level data from the 2018 resident cohort were amalgamated and analyzed. The number of Entrustable Professional Activity (EPA) assessments (overall and for each EPA) and the timing of resident promotion through program stages were compared between programs and against the guidelines provided by the national EM specialty committee. Total EPA observations from each program were correlated with the number of EM and pediatric EM rotations.

Results: Data from 15 of 17 (88.2%) EM programs, comprising 9,842 EPA observations from 68 of the 77 (88.3%) Canadian EM specialist residents in the 2018 cohort, were analyzed. The average number of EPAs observed per resident in each program varied from 92.5 to 229.6 and correlated strongly with the number of blocks spent on EM and pediatric EM (r = 0.83, p < 0.001). Relative to the guidelines outlined by the specialty committee, residents were promoted later than expected and with fewer EPA observations than suggested.

Conclusion: We present a new approach to the amalgamation of national and program-level assessment data. There was demonstrable variation in both EPA-based assessment numbers and promotion timelines between programs and relative to national guidelines. These evaluation data will inform the revision of local programs and national guidelines and serve as a starting point for further-reaching outcome evaluation. This process could be replicated by other national assessment programs.
