Abstract

The DVM program at the University of Calgary offers a Clinical Skills course in each of the first three years. The course is designed to teach students the procedural skills required for entry-level general veterinary practice. Objective Structured Clinical Examinations (OSCEs) were used to assess students' performance on these procedural skills. A series of three OSCEs was developed for the first year. Content was determined by an exam blueprint, exam scoring sheets were created, rater training was provided, a mock OSCE was performed with faculty and staff, and the criterion-referenced Ebel method was used with two content experts to set cut scores for each station. Each station and the overall exam were graded as pass or fail. Thirty first-year DVM students were assessed. Content validity was ensured by the exam blueprint and expert review. Station reliabilities (coefficient α) across the three OSCEs ranged from 0.0 to 0.71. Overall exam reliabilities (Generalizability Theory) were G=0.56 for OSCE 1, G=0.37 for OSCE 2, and G=0.32 for OSCE 3. Preliminary analysis suggested that the OSCEs demonstrate face and content validity, and certain stations demonstrated adequate reliability. Overall exam reliability was low, which reflects issues with first-time exam delivery. Because this was the first year that the course was taught and this exam format was used, work continues in the program on the teaching of procedural skills and the development and revision of OSCE stations and scoring checklists.
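For readers unfamiliar with the station-reliability statistic reported above, coefficient α can be computed from a students × checklist-items score matrix as α = k/(k−1) × (1 − Σ item variances / total-score variance). A minimal sketch follows; the scores shown are hypothetical, not data from this study:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Coefficient alpha for a students x items score matrix (list of rows)."""
    k = len(item_scores[0])            # number of checklist items
    items = list(zip(*item_scores))    # one tuple of scores per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical binary checklist scores for one station (4 students, 3 items)
scores = [[1, 1, 1],
          [1, 1, 0],
          [0, 0, 0],
          [1, 0, 0]]
print(round(cronbach_alpha(scores), 2))  # → 0.75
```

Population (rather than sample) variances are used here, which is one common convention; either convention is fine as long as it is applied consistently to the item and total-score variances.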
