Abstract

Teamwork is integral to effective health care but difficult to evaluate, and few measurement tools have been tested outside classroom or medical simulation settings. Accordingly, we aimed to develop and pilot test an easy-to-use direct observation instrument for measuring teamwork among medical house staff. We performed direct observations of 18 inpatient medicine house staff teams at a teaching hospital using an instrument constructed from existing teamwork tools, expert panel review, and pilot testing. We examined differences across teams using the Kruskal-Wallis statistic, interrater reliability with the κ statistic, domain scales using Cronbach α, and construct validity using correlation and multivariable regression analyses of quality and utilization metrics. Observers rated team performance before and after providing feedback to 12 of the 18 team leaders and assessed changes in team performance using paired two-tailed t tests. We found variation in team performance in the situation monitoring, mutual support, and communication domains. The instrument demonstrated good interrater reliability among concurrent, independent observers (κ = 0.7, P < 0.001). It had satisfactory face validity based on expert panel review and the assessments of resident team leaders. Construct validity was supported by a positive correlation between team performance and the Hospital Consumer Assessment of Healthcare Providers and Systems physician communication score (r = 0.6, P = 0.03). Providing resident physicians with information about their teams' performance was associated with improved mean performance in follow-up observations (from 3.6 to 3.8 on a 4.0-point scale, P = 0.001). Direct observation of teamwork behaviors by medicine house staff on ward rounds is feasible, and feedback may improve performance.
