Abstract

A prospective collection and analysis of examination marks over three consecutive academic years was undertaken to determine the correlation between the objective structured clinical examination (OSCE) and the other components of the medical students' examination, and to examine how discriminatory the OSCE is in the assessment of clinical competence. Of 388 students, 96·3% passed the examination at the first attempt. Of those who passed, 15·5% achieved a merit and 1·0% a distinction. When the OSCE component was excluded from the analysis, 11·6% and 2·8% had merits and distinctions, respectively, but when the clinical examination was excluded, there were 17·8% merits and 6·2% distinctions. Correlations between the various components of the examination were significant, except for that between the clinical examination and the project. Although the OSCE and clinical components of examinations for clinical students are complementary, the OSCE component awards more merit and distinction grades. Despite the statistically significant correlations between the various components, only 11% of the variability in clinical examination scores could be explained by performance in the OSCE, suggesting that the different components test different aspects of the students' clinical competence. Completely replacing the clinical examination with an OSCE may therefore not be the best way of assessing medical students' clinical competence.
