The objectives of this study were to: 1) implement web-based instruments for assessing and documenting the general competencies of otolaryngology resident education, as outlined by the Accreditation Council for Graduate Medical Education (ACGME); and 2) examine the benefit and validity of this online system for measuring educational outcomes and for identifying insufficiencies in the training program as they occur. We developed an online assessment system for a surgical postgraduate education program and examined its feasibility, usability, and validity. Evaluations of the behaviors, skills, and attitudes of 26 residents were completed online by faculty, peers, and nonphysician professionals over a 3-year period. Analyses included calculation and evaluation of each resident's total average performance scores from the different evaluator types. Evaluations were also compared with each resident's scores on the in-service examination (ISE) administered by the American Board of Otolaryngology. Convergent validity was examined statistically by comparing ratings among the different evaluator types.

The questionnaires and software were simple to use and efficient in collecting essential information. From July 2002 to June 2005, 1,336 evaluation forms were available for analysis. The average score assigned by faculty was 4.31, significantly lower than the averages assigned by nonphysician professionals (4.66) and by residents evaluating peers (4.63) (P < .001), whereas scores were similar between nonphysician professionals and resident peers. Average scores of the faculty and nonphysician groups correlated in the constructs of communication and relationship with patients, but not in those of professionalism and documentation. Between the faculty and resident peer groups, correlation was observed for respect for patients but not for medical knowledge. Resident ISE scores improved in the third year of the study and correlated highly with faculty perceptions of medical knowledge (r = 0.65, P = .007). Compliance with form completion was 97%.

The system facilitated the educational management of our training program along multiple dimensions. The small perceptual differences among a highly selected group of residents have made unambiguous validation of the system challenging. The instruments and approach warrant further study; improvements are likely best achieved through broad consultation with other otolaryngology programs.
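As a minimal illustrative sketch only (not the study's software or data), the kind of analysis reported above, average ratings by evaluator type and a Pearson correlation between faculty medical-knowledge ratings and ISE scores, could be computed as follows. All values and variable names here are hypothetical, and the sketch assumes Python with SciPy available.

```python
# Illustrative sketch only: hypothetical ratings, not the study's data or software.
# Shows how average scores by evaluator type and a faculty-vs-ISE Pearson
# correlation, as reported in the abstract, could be computed.

from statistics import mean
from scipy.stats import pearsonr  # assumes SciPy is installed

# Hypothetical 1-5 ratings grouped by evaluator type.
ratings = {
    "faculty": [4.2, 4.4, 4.3, 4.1, 4.5],
    "nonphysician": [4.7, 4.6, 4.8, 4.5, 4.7],
    "resident_peer": [4.6, 4.7, 4.5, 4.6, 4.8],
}

for group, scores in ratings.items():
    print(f"{group}: mean rating = {mean(scores):.2f}")

# Hypothetical per-resident values: faculty medical-knowledge ratings vs. ISE scores.
faculty_mk = [4.1, 4.5, 4.3, 4.7, 4.0, 4.6]
ise_scores = [310, 355, 330, 370, 300, 360]

r, p = pearsonr(faculty_mk, ise_scores)
print(f"Pearson r = {r:.2f}, P = {p:.3f}")
```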