Abstract

Delivering quality lectures is a critical skill for residents seeking careers in academia, yet no validated tools exist for assessing resident lecture skills. The authors sought to develop and validate a lecture assessment tool. Using a nominal group technique, the authors derived a behaviorally anchored assessment tool. Baseline characteristics of resident lecturers, including prior lecturing experience and perceived comfort with lecturing, were collected. Faculty and senior residents used the tool to assess lecturer performance at weekly conference. A postintervention survey assessed the usability of the form and the quantity and quality of the feedback. Analysis of variance was used to examine relationships between performance within individual domains and the baseline data. Generalizability coefficients and scatterplots with jitter were used to assess inter-rater reliability. Of 64 residents assessed, most (68.8%) had previous lecturing experience and 6.3% had experience as a regional/national speaker. Performance differed significantly between first-year and fourth-year residents in the domains of Content Expertise (p<0.001), Presentation Design/Structure (p=0.014), and Lecture Presence (p=0.001). Residents who reported higher comfort with lecturing performed better in the domains of Content Expertise (p=0.035), Presentation Design/Structure (p=0.037), and Lecture Presence (p<0.001). We found fair agreement between raters in all domains except Goals and Objectives. Both lecturers and evaluators perceived the feedback delivered as specific and of adequate quantity and quality. Evaluators described the form as highly usable. The derived behaviorally anchored assessment tool is a sufficiently valid instrument for the assessment of resident-delivered lectures.
