Abstract

Purpose: The purpose of this study was to evaluate the feasibility of applying coactivity scales to entrustable professional activity (EPA)-based assessments by comparing (1) the ratings of students performing EPAs of increasing difficulty and (2) the ratings of cohorts of students performing the same EPAs over time.

Background: Clinical clerkships provide critical opportunities to evaluate medical student performance on the Association of American Medical Colleges (AAMC) Core EPAs for Entering Residency. However, there is no standardized method for assigning entrustability [1]. The AAMC proposes the use of the modified Ottawa coactivity scale to anchor entrustability assessments [2].

Design: We adapted a modified PRIME scale (professionalism, reporter, interpreter, and manager) to assign a developmental hierarchy to the EPAs [3]. To assess student skills in selected EPAs, we used the modified Ottawa coactivity scale and assigned values of 1–4, in order of increasing entrustability, to the anchors in the scale. Mean ratings of EPAs over time were calculated and compared. IBM SPSS Statistics version 25 (IBM, Armonk, New York) was used for descriptive analyses, ANOVA, and general linear models.

Outcomes: A total of 2,623 evaluations (Cronbach's alpha = 0.927) were completed for 247 medical students; ratings increased significantly over time for professionalism (P = .011), reporter (P = .007), interpreter (P = .004), and manager (P = .007). Mean ratings decreased as the intellectual demand of the domain increased: 3.51 (±0.50) for professionalism, 3.45 (±0.47) for reporter, 3.35 (±0.59) for interpreter, and 3.28 (±0.60) for manager.

Strengths and Limitations: The University of Miami EPA assessment tool provides a useful measure of student readiness for patient care. Raters entrusted students more with the skills of professionalism, reporting, and interpreting than with managing, a more sophisticated skill. In every domain, entrustability increased with clinical experience. A limitation of this study is that the tool was used at a single institution.

Feasibility and Transferability: The tool can potentially be deployed across multiple clerkships at other medical schools as an effective and practical method of medical student evaluation.
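
Statistical analysis sketch (illustrative only): the analyses above were performed in IBM SPSS Statistics 25; the short Python sketch below shows how the descriptive statistics and one-way ANOVA described in the Design section could be approximated. The file name, the clerkship_block grouping variable, and the domain column names are hypothetical placeholders, not part of the study.

# Illustrative Python analog of the SPSS analyses; all names are hypothetical.
import pandas as pd
from scipy import stats

evals = pd.read_csv("epa_evaluations.csv")  # hypothetical table of 1-4 ratings
domains = ["professionalism", "reporter", "interpreter", "manager"]

# Descriptive statistics: mean and SD rating per EPA domain
print(evals[domains].agg(["mean", "std"]).round(2))

# One-way ANOVA per domain: do mean ratings differ across clerkship blocks?
for domain in domains:
    groups = [g[domain].dropna() for _, g in evals.groupby("clerkship_block")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"{domain}: F = {f_stat:.2f}, P = {p_value:.3f}")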
