Abstract
The most recent iteration of the Accreditation Council for Graduate Medical Education duty-hour regulations includes language mandating handoff education for trainees and assessment of handoff quality by residency training programs. However, there is a lack of validated tools for assessing handoff quality and for use in trainee education. Faculty at 2 sites (University of Chicago and Yale University) were recruited to participate in a workshop on handoff education. Video-based scenarios were developed to represent varying levels of performance in the domains of communication, professionalism, and setting. Videos were shown in random order, and faculty were instructed to rate the handoffs using the Handoff Mini-Clinical Evaluation Exercise (CEX), a paper-based instrument with qualitative anchors defining each level of performance. Forty-seven faculty members (14 at site 1; 33 at site 2) participated in the validation workshops, providing a total of 172 observations (of a possible 191 [96%]). Reliability testing revealed a Cronbach α of 0.81 and a Kendall coefficient of concordance of 0.59 (>0.6 = high reliability). Faculty were able to reliably distinguish the different levels of performance in each domain in a statistically significant fashion (ie, unsatisfactory professionalism mean 2.42 vs satisfactory professionalism 4.81 vs superior professionalism 6.01; P < 0.001, trend test). Two-way analysis of variance revealed no evidence of rater bias. Using standardized video-based scenarios highlighting differing levels of performance, we demonstrated that the Handoff Mini-CEX can support reliable and valid conclusions regarding handoff performance. Future work to validate the tool in clinical settings is warranted.