Abstract

Objective Structured Clinical Examinations (OSCEs) are used at the majority of U.S. medical schools. Given the high resource demands of constructing and administering OSCEs, understanding how OSCEs relate to typical performance measures in medical school could help educators design curricula and evaluations more effectively to optimize student instruction and assessment. Our objective was to investigate the correlation between second-year and third-year OSCE scores, as well as the associations between OSCE scores and several other typical measures of students' medical school performance. We tracked the performance of a 5-year cohort (classes of 2007-2011). We studied the univariate correlations among OSCE scores, U.S. Medical Licensing Examination (USMLE) scores, and medical school grade point average. We also examined whether OSCE scores explained additional variance in the USMLE Step 2 Clinical Knowledge score beyond that explained by the Step 1 score. The second- and third-year OSCE scores were weakly correlated. Neither second- nor third-year OSCE scores were strongly correlated with USMLE scores or medical school grade point average. Our findings suggest that OSCEs capture a perspective different from that of typical assessment measures, which largely rely on multiple-choice questions; these results also support tenets of situated cognition theory.
