Purpose: Assessment of medical students during clerkships is largely driven by clinical performance ratings, but many students perceive clinical evaluations to be "unfair" and call for more training of evaluators.1 Limitations in self-assessment may contribute to this perception,2 and frame-of-reference training may improve students' ability to recognize different levels of performance.3 The objective of this study was to determine whether an online frame-of-reference training module can improve students' understanding of expectations for clinical performance in the medicine clerkship.

Methods: Core faculty in the medicine clerkship developed multiple case presentations demonstrating different levels of performance in the Reporter, Interpreter, Manager, and Educator (RIME) components of our clerkship evaluation form. The case presentations were revised iteratively to reach consensus on whether each presentation was below, at, or exceeding expectations for clerkship students. Using these presentations, we developed an online, interactive frame-of-reference training tool in which students, before starting clerkships, rated a set of case presentations, compared their ratings with those determined by core faculty (correct ratings), and received feedback about why a given presentation merited a particular rating. At the start of the clerkship, students completed another training module with a new set of case presentations, either remotely on their own (odd-numbered clerkship blocks) or in person with group discussion facilitated by the clerkship director (even-numbered blocks). We used chi-square tests to compare the proportion of students choosing correct ratings at baseline versus in-clerkship training and to compare responses between the remote and in-person training groups on an end-of-clerkship survey about the effect of training on their understanding of clerkship expectations.

Results: All rising third-year students (N = 177) completed baseline training, and 140 students (100% of those enrolled in blocks 1–6 of the medicine clerkship) completed the in-clerkship training. Students enrolled in blocks 7 and 8 were excluded because of COVID-related cancellations and rescheduling of clinical rotations. Overall, the percentage of cases answered correctly improved from 64.8% at baseline to 74.5% at in-clerkship training (P < .001). Across individual RIME domains, improvements were seen for all domains except interpreter: reporter, 55.7% to 67.6% (P = .002); interpreter, 74.3% to 62.8% (P = .001); manager, 68.0% to 82.5% (P < .0001); and educator, 62.5% to 85.1% (P < .0001). The pattern of improvement did not differ significantly between the remote and in-person training groups. The majority of students agreed or strongly agreed that the online training improved their understanding of clerkship expectations (66.0% of the remote group versus 75.0% of the in-person group, P = .604).

Discussion: In this study of frame-of-reference training for students, we found a significant increase in the proportion of cases rated correctly from baseline to in-clerkship training. The lack of improvement in the interpreter domain likely reflects the challenge of the clinical reasoning it requires, the RIME component students are most uncomfortable with during the preclinical years and the skill most heavily emphasized in the medicine clerkship. Repeating the training module at the end of the medicine clerkship may have shown different results. Our finding that students' perception of the training's usefulness was similar between the remote and in-person groups is worth noting: the added faculty time and effort, often cited as a barrier to educational initiatives, did not significantly improve the effectiveness of the training in this case.
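As a methodological illustration, the sketch below shows the kind of chi-square comparison of correct-rating proportions described in Methods, using Python with scipy. The case counts are hypothetical placeholders chosen only to approximate the reported overall percentages (64.8% vs. 74.5%); they are not the study's data, and the total number of rated cases is an assumption.

```python
# Minimal sketch of a chi-square test comparing the proportion of correctly
# rated cases at baseline versus in-clerkship training.
# NOTE: the counts below are hypothetical placeholders, not the study's data.
from scipy.stats import chi2_contingency

# Rows: baseline vs. in-clerkship training; columns: correct vs. incorrect ratings
observed = [
    [648, 352],  # hypothetical baseline counts (~64.8% correct)
    [745, 255],  # hypothetical in-clerkship counts (~74.5% correct)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4g}, dof = {dof}")
```

The same 2 × 2 layout would apply to the per-domain comparisons and to the comparison of survey responses between the remote and in-person groups.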
Significance: While much of the effort to improve clinical evaluations has focused on rater training, this study assessed whether training students can improve their understanding of clerkship expectations. Our findings show that a brief, online frame-of-reference training tool can improve student understanding of different levels of clinical performance.

Acknowledgments: The authors thank their faculty and medical students for their contribution and participation in this study.