Abstract
To compare scoring methods for objective structured clinical examinations (OSCEs) using real-time observation via video monitors versus observation of videotapes. Second-year (P2) and third-year (P3) doctor of pharmacy (PharmD) students completed 3-station OSCEs. Sixty encounters, 30 from each PharmD class, were selected at random and scored by faculty investigators observing video monitors in real time. One month later, the same encounters were scored by investigators using videotapes. Intra-rater reliability between real-time and videotaped observation was excellent (ICC(3,1) of 0.951 for P2 students and 0.868 for P3 students). However, in both the P2 and P3 cohorts, 13.3% of students changed from passing based on real-time observation to failing based on video observation, and 3.3% changed from failing in real time to passing on video. Despite excellent overall reliability, important differences in OSCE pass/fail determinations were found between real-time and video observations. These observation methods for scoring OSCEs are not interchangeable.
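The reliability statistic cited above is the Shrout and Fleiss ICC(3,1), the two-way mixed-effects, consistency, single-measurement form of the intraclass correlation. As a minimal sketch of how such a coefficient is computed from paired real-time and videotape scores, the following Python example works from the standard ANOVA definition; the function name, the implementation, and the example data are illustrative assumptions, not the authors' analysis code or data.

```python
import numpy as np

def icc_3_1(ratings: np.ndarray) -> float:
    """ICC(3,1): two-way mixed-effects, consistency, single measurement.

    `ratings` is an (n_encounters x k_methods) array; here the two
    "methods" would be the real-time score and the videotape score
    assigned to each encounter.
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-encounter means
    col_means = ratings.mean(axis=0)   # per-method means

    # Two-way ANOVA sums of squares
    ss_rows = k * np.sum((row_means - grand_mean) ** 2)
    ss_cols = n * np.sum((col_means - grand_mean) ** 2)
    ss_total = np.sum((ratings - grand_mean) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(3,1) = (BMS - EMS) / (BMS + (k - 1) * EMS)
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# Hypothetical example: six encounters scored in real time and from videotape.
scores = np.array([
    [82.0, 80.0],
    [74.0, 71.0],
    [90.0, 91.0],
    [65.0, 60.0],
    [88.0, 87.0],
    [70.0, 72.0],
])
print(f"ICC(3,1) = {icc_3_1(scores):.3f}")
```

Because this consistency form of the ICC rewards agreement on the rank-ordering of examinees rather than exact score agreement, a coefficient above 0.9 can plausibly coexist with a meaningful number of pass/fail reclassifications for students scoring near the cut point, which is consistent with the discrepancy the abstract reports.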