Abstract

Evaluating resident physicians' communication skills is challenging and is increasingly accomplished with standardized examinations. There is a need to identify effective and efficient methods for assessing communication skills. We compared the objective structured clinical examination (OSCE) and direct observation as approaches for assessing resident communication skills. We conducted a retrospective cohort analysis of orthopaedic surgery resident physicians at a single tertiary care academic institution, using the Institute for Healthcare Communication "4 Es" model for effective communication. Data were collected between 2011 and 2015. Residents were included if they had 1 OSCE assessment and 2 or more complete direct observation assessments; 28 of a possible 59 residents (47%) met these criteria and were included in the analysis. Overall, 89% (25 of 28) of residents passed the communication skills OSCE, whereas only 54% (15 of 28) passed the direct observation communication assessment. There was a moderate positive correlation between OSCE and direct observation scores overall (r = 0.415, P = .028). After adjusting for chance agreement, there was no agreement between OSCE and direct observation in categorizing residents as passing or failing (κ = 0.205, P = .16). Our results suggest that OSCE and direct observation provide different insights into resident communication skills (simulation of rare and challenging situations versus real-life daily encounters) and may offer useful, complementary perspectives in different contexts.
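
As an illustration of the two agreement statistics reported above (Pearson correlation of scores and Cohen's kappa for pass/fail categorization adjusted for chance agreement), the minimal Python sketch below shows how such values can be computed. The scores, pass threshold, and variable names are hypothetical placeholders for illustration only and are not drawn from the study data or the authors' analysis.

    # Illustrative sketch only: Pearson r between OSCE and direct observation
    # scores, and Cohen's kappa for pass/fail agreement (chance-adjusted).
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical per-resident percentage scores on each assessment.
    osce_scores = np.array([82, 75, 90, 68, 88, 79, 95, 71])
    direct_obs_scores = np.array([78, 70, 85, 60, 91, 66, 89, 74])

    # Pearson r quantifies the linear association between the two score sets.
    r, p_value = pearsonr(osce_scores, direct_obs_scores)
    print(f"Pearson r = {r:.3f}, P = {p_value:.3f}")

    # Dichotomize into pass/fail with an assumed cutoff, then compute Cohen's
    # kappa, which adjusts observed pass/fail agreement for chance agreement.
    PASS_CUTOFF = 75  # assumed threshold for illustration only
    osce_pass = (osce_scores >= PASS_CUTOFF).astype(int)
    direct_obs_pass = (direct_obs_scores >= PASS_CUTOFF).astype(int)
    kappa = cohen_kappa_score(osce_pass, direct_obs_pass)
    print(f"Cohen's kappa = {kappa:.3f}")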
