Abstract

Objective: To evaluate a 3-year experience with Objective Structured Clinical Examinations (OSCEs) and to compare faculty expectations with resident performance.

Design: Descriptive analysis of measures of resident performance.

Setting: Community-based pediatric residency program in Michigan.

Participants: One hundred twenty-six pediatric residents at all levels of training.

Methods: The three examinations consisted of 36 to 42 five-minute stations testing skills in physical examination, history taking, counseling, telephone management, and test interpretation. A committee of faculty and chief residents predetermined minimum pass levels for each level of training. Results were compared with other indices of resident performance.

Results: There was evidence of content, construct, and concurrent validity, as well as a high degree of reliability. However, 40% to 96% of residents scored below the minimum pass levels set for their level of training. In each examination, third-year residents had the highest failure rates, even though they scored well on the American Board of Pediatrics in-training examination and on their monthly clinical evaluations. Furthermore, at all levels of training, scores reflecting application of data were significantly lower than those assessing data gathering.

Conclusions: The gaps between faculty expectations and resident performance, and between data gathering and data application, have important implications for institutional educational philosophy; they suggest shifting the design of instructional and evaluation methods toward more clinically oriented, learner-directed strategies.
