Abstract

Background: Emergency Medicine (EM) clerkships traditionally assess students using numerical ratings of clinical performance. The descriptive ratings of the Reporter, Interpreter, Manager, and Educator (RIME) method have been shown to be valuable in other specialties.

Objectives: We hypothesized that the RIME descriptive ratings would correlate with clinical performance and examination scores in an EM clerkship, indicating that the RIME ratings are a valid measure of performance.

Methods: This was a prospective cohort study of an evaluation instrument for 4th-year medical students completing an EM rotation. The study received exempt Institutional Review Board status. EM faculty and residents completed shift evaluation forms that included both numerical and RIME ratings. Students completed a final examination. Mean scores for RIME and clinical evaluations were calculated. Linear regression models were used to determine whether RIME ratings predicted clinical evaluation scores or final examination scores.

Results: Four hundred thirty-nine students who completed the EM clerkship were enrolled in the study. After excluding items with missing data, 2086 evaluation forms (based on 289 students) were available for analysis. There was a clear positive relationship between RIME category and clinical evaluation score (r² = 0.40, p < 0.01). RIME ratings correlated most strongly with patient management skills and least strongly with humanistic qualities. Only a very weak correlation was seen between RIME ratings and final examination scores.

Conclusion: We found a positive association between RIME and clinical evaluation scores, suggesting that RIME is a valid clinical evaluation instrument. RIME descriptive ratings can be incorporated into EM evaluation instruments and provide useful data related to patient management skills.
