Abstract
Assessment plays a key role in the learning process, and the validity of any given assessment tool should ideally be established. If an assessment is to act as a guide to future teaching and learning, its predictive validity must be established. This study aimed to assess the ability of an objective structured clinical examination (OSCE), taken at the end of the first clinical year of an undergraduate medical degree, to predict later performance in clinical examinations. The performance of two consecutive cohorts of year 3 medical undergraduates (n=138 and n=128) in a 23-station OSCE was compared with their performance in five subsequent clinical examinations in years 4 and 5 of the course. Poor performance in the OSCE was strongly associated with later poor performance in other clinical examinations: students in the lowest three deciles of OSCE performance were six times more likely to fail another clinical examination. Receiver operating characteristic (ROC) curves were constructed as a method to criterion-reference the cut point for future examinations. Performance in an OSCE taken early in the clinical course strongly predicts later clinical performance. Tracking subsequent student performance is a powerful tool for assessing examination validity, and the use of ROC curves represents a novel method for determining criterion-referenced cut points for future examinations.
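The following is a minimal sketch, not the authors' analysis, of how an ROC curve could be used to derive a criterion-referenced cut point from early OSCE scores. It assumes hypothetical arrays `osce_scores` (year 3 OSCE totals) and `later_fail` (1 = failed a subsequent clinical examination), here simulated, and uses scikit-learn's `roc_curve` with Youden's J statistic to choose the threshold.

```python
# Illustrative sketch only: simulated data stand in for the two cohorts
# (n = 138 + 128 = 266); the variable names and threshold rule are assumptions.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

n = 266
osce_scores = rng.normal(loc=65, scale=10, size=n)       # OSCE percentage scores
fail_prob = 1 / (1 + np.exp((osce_scores - 50) / 4))     # lower score -> higher failure risk
later_fail = rng.binomial(1, fail_prob)                  # 1 = failed a later clinical exam

# ROC analysis: negate the OSCE score so that higher values predict failure.
fpr, tpr, thresholds = roc_curve(later_fail, -osce_scores)
auc = roc_auc_score(later_fail, -osce_scores)

# Youden's J (sensitivity + specificity - 1) selects the threshold that best
# separates students who later fail from those who pass.
j = tpr - fpr
best = np.argmax(j)
cut_point = -thresholds[best]   # undo the negation to recover an OSCE score

print(f"AUC = {auc:.2f}")
print(f"Suggested criterion-referenced cut point: OSCE score ~ {cut_point:.1f}")
print(f"Sensitivity = {tpr[best]:.2f}, Specificity = {1 - fpr[best]:.2f}")
```

In practice the choice of threshold would also weigh the relative costs of false positives (flagging students who would have passed) against false negatives (missing students who go on to fail), rather than relying on Youden's J alone.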