Abstract

The results of a simple method for evaluating pediatric house officer performance, and for establishing the criteria on which that evaluation is based, are presented. House officer ratings are compared with their intern matching plan rankings and with their Pediatric Board "pre-test" or regular scores. Finally, a pediatric faculty's educational objectives are compared with its performance criteria. A performance rating between 1 (excellent) and 4 (unsatisfactory), computed as the mean of scores from 12 faculty members, was calculated for each of 27 house officers. Interrater reliability was .63. The mean house staff rating was 1.95 ± SD .50 (range 1.16 to 3.15). Correlation coefficients of the four- and fifteen-month reevaluations with the original evaluation were .938 and .888, respectively. Six criteria of performance (compassion, knowledge, dependability, critical attitude, teamwork, and efficiency) were independently listed by a majority of faculty members as the basis for their evaluation. Ratings correlated neither with Pediatric Board scores nor with intern matching plan rankings. Although the faculty viewed cognitive skills as objectives, they expected competent clinical performance to result from house staff training. We conclude that total clinical performance can be reproducibly rated, that it is predicted neither by Board scores nor by matching plan rankings, and that both cognitive and noncognitive criteria for effective performance of a pediatric house officer's job should be clearly stated by a pediatric faculty.
