Abstract

Improving the reliability and consistency of objective structured clinical examination (OSCE) raters’ marking poses a continual challenge in medical education. The purpose of this study was to evaluate an e-Learning training module for OSCE raters who participated in the assessment of third-year medical students at the University of Ottawa, Canada. The effects of online training were compared with those of a traditional in-person (face-to-face) orientation. Of the 90 physicians recruited as raters for this OSCE, 60 (66.7%) consented to participate in the study in March 2017. Of the 60 participants, 55 rated students during the OSCE, while the remaining 5 served as back-up raters. Forty-one raters received the online training, and 19 received the traditional in-person training. Of those with prior OSCE experience (n = 18) who participated in the online group, 13 (68%) reported preferring this format to the in-person orientation. The average time needed to complete the online module was 15 minutes. Furthermore, 89% of the participants felt the module provided clarity in the rater training process. There was no significant difference in the number of missing ratings between the two orientation types. Our study indicates that online OSCE rater training is comparable to traditional face-to-face orientation.

Highlights

  • In the context of objective structured clinical examinations (OSCEs), raters are typically provided with an orientation to ensure familiarity with the rating instruments used and to define standards for acceptable performance [1]

  • Forty-one raters were allocated to the online orientation, and 19 were allocated to the in-person orientation

  • The convenience and flexibility of an online format for OSCE raters was appealing across a spectrum of experiences


Summary

Introduction

In the context of objective structured clinical examinations (OSCEs), raters are typically provided with an orientation to ensure familiarity with the rating instruments used and to define standards for acceptable performance [1]. At the University of Ottawa, we use resident physicians and faculty physicians as raters for undergraduate medical student OSCEs. Raters receive an in-person orientation prior to each OSCE to ensure that they understand their required tasks (e.g., assessing students and/or providing feedback in formative OSCEs). One challenge with in-person orientations is that raters may have conflicting clinical duties that prevent their attendance, causing them to miss important aspects of the training. To address this issue, we developed an online rater training module for an undergraduate OSCE (Supplement 1).

