Abstract

Surgical competency requires sound clinical judgment, a systematic diagnostic approach, and the integration of a wide range of nontechnical skills. These more complex aspects of clinician development have traditionally been difficult to measure with standard assessment methods. This study used the Clinical Practice Instrument (CPI) to measure nontechnical diagnostic and management skills during otolaryngology residency training; to determine whether these skills change measurably between postgraduate years (PGYs) 2, 4, and 5; and to evaluate whether results vary by subspecialty topic or method of administration.

This was a prospective study in an otolaryngology residency training program using the CPI, an instrument with previously established internal consistency, reproducibility, interrater reliability, discriminant validity, and responsiveness to change. As part of a longitudinal residency educational initiative, the CPI was used to evaluate progression in residents' ability to evaluate, diagnose, and manage case-based clinical scenarios, with assessments during PGYs 2, 4, and 5 of residency. A total of 248 evaluations were performed on 45 otolaryngology resident trainees at regular intervals. Analysis of variance with nesting and postestimation pairwise comparisons was used to evaluate total and domain scores according to training level, subspecialty topic, and method of administration.

Among the 45 otolaryngology residents (248 CPI administrations), there was a mean (SD) of 5 (3) administrations per resident (range, 1-4) during training. Total scores differed significantly among PGY levels, with lower scores at the PGY-2 level (mean [SD], 44 [16]) than at the PGY-4 (64 [13]) or PGY-5 (69 [13]) level (P < .001). Domain scores related to information gathering and organizational skills were acquired earlier in training, whereas knowledge base and clinical judgment improved later in residency. Trainees scored higher in general otolaryngology (mean [SD], 72 [14]) than in the subspecialties (range, 55 [12], P = .003, to 56 [19], P < .001). Neither administering the examination with an electronic rather than a paper-based scoring system nor the calendar year of administration affected these results.

Standardized interval evaluation with the CPI demonstrates improvement in qualitative diagnostic and management capabilities as PGY levels advance.
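To illustrate the kind of group comparison reported above, the following is a minimal sketch, not the authors' actual analysis: the study used a nested analysis of variance with postestimation pairwise comparisons, whereas this example runs a simple one-way ANOVA on hypothetical scores drawn to match the reported per-level means and SDs. The sample sizes and random draws are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical CPI total scores simulated to match the reported
# means/SDs by training level: PGY-2, 44 (16); PGY-4, 64 (13);
# PGY-5, 69 (13). Group sizes (n = 30) are assumed, not from the study.
pgy2 = rng.normal(44, 16, 30)
pgy4 = rng.normal(64, 13, 30)
pgy5 = rng.normal(69, 13, 30)

# One-way ANOVA testing whether mean scores differ across the three
# training levels (a simplification of the study's nested ANOVA).
f_stat, p_value = stats.f_oneway(pgy2, pgy4, pgy5)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
```

With group differences of roughly 20 to 25 points against SDs near 15, the test yields a very small P value, consistent with the reported P < .001 across PGY levels.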
