Human factors and teamwork are major contributors to sentinel events. A major barrier to improving human factors and teamwork is the paucity of objective, validated measurement tools. Our goal was to develop a brief tool that could be used to objectively evaluate teamwork in the field, both during short clinical team simulations and in everyday clinical care.

This was a pilot validation study. Standardized videos were created demonstrating poor, average, and excellent teamwork by an obstetric team in a common clinical scenario (shoulder dystocia). Three evaluators, all trained in Crew Resource Management and blinded to the assigned teamwork level, independently reviewed the videos and rated teamwork using the Clinical Teamwork Scale (CTS). Statistical analysis included the kappa statistic and Kendall's coefficient of concordance to evaluate agreement and score concordance among raters, and the intraclass correlation coefficient (ICC) to evaluate interrater reliability. The reliability of the tool was further evaluated by estimating the variance attributable to each component of the tool based on generalizability theory.

There was substantial agreement (kappa 0.78) and strong score concordance (Kendall's coefficient 0.95) among raters, and excellent interrater reliability (ICC 0.98). The largest proportion of variance in scores among raters was attributable to the rater-by-item interaction.

The CTS was developed to efficiently measure key clinical teamwork skills during simulation exercises and in everyday clinical care. It contains 15 questions in 5 clinical teamwork domains (communication, situational awareness, decision-making, role responsibility, and patient friendliness). It is easy to use and demonstrated construct validity, with median ratings consistently corresponding to the intended teamwork level. The CTS is a brief, valid, reliable, and easy-to-use tool for measuring key teamwork skills in both simulated and clinical settings.
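For readers who want to reproduce this style of agreement analysis, the sketch below illustrates the three statistics reported above. It is a minimal illustration under stated assumptions, not the study's analysis: the rating matrix is invented, kappa is computed in its Fleiss form (one common multi-rater generalization; the abstract does not specify the variant), and the ICC is the two-way random-effects, single-rater form, ICC(2,1), one plausible choice among several. The same two-way ANOVA mean squares also yield rough generalizability-theory variance components for a generic rater-by-target crossed design; the study's decomposition was over the CTS items, so the mapping here is illustrative only.

```python
import numpy as np
from scipy.stats import rankdata
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Invented ratings for illustration only (NOT the study's data):
# rows = rated units (e.g., video segments), columns = 3 raters,
# values = an ordinal teamwork score.
ratings = np.array([
    [2, 2, 3],
    [5, 5, 5],
    [8, 7, 8],
    [3, 2, 2],
    [6, 6, 5],
    [9, 9, 8],
])
n, k = ratings.shape  # n rated units, k raters

# Kappa: chance-corrected agreement. Fleiss' kappa handles >2 raters
# but treats score categories as nominal.
table, _ = aggregate_raters(ratings)  # raters-per-category counts per unit
print("Fleiss' kappa:", fleiss_kappa(table))

# Kendall's coefficient of concordance (W): agreement among the raters'
# rankings of the units (simple form, no tie correction).
ranks = np.apply_along_axis(rankdata, 0, ratings)  # rank units within each rater
rank_sums = ranks.sum(axis=1)                      # per-unit rank sum across raters
S = ((rank_sums - rank_sums.mean()) ** 2).sum()
W = 12.0 * S / (k**2 * (n**3 - n))
print("Kendall's W:", W)

# Two-way ANOVA mean squares, reused for both ICC(2,1) and variance components.
grand = ratings.mean()
ss_total = ((ratings - grand) ** 2).sum()
ms_target = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_rater = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)
ss_error = ss_total - (n - 1) * ms_target - (k - 1) * ms_rater
ms_error = ss_error / ((n - 1) * (k - 1))

# ICC(2,1): two-way random effects, absolute agreement, single rater.
icc = (ms_target - ms_error) / (
    ms_target + (k - 1) * ms_error + k * (ms_rater - ms_error) / n
)
print("ICC(2,1):", icc)

# Generalizability-theory style variance components from the same mean squares.
# With one observation per cell, the rater-by-unit interaction cannot be
# separated from residual error.
var_target = (ms_target - ms_error) / k
var_rater = (ms_rater - ms_error) / n
var_interaction_residual = ms_error
print("variance components:", var_target, var_rater, var_interaction_residual)
```

A large rater or rater-by-unit component relative to the target component, as the study found for the rater-by-item interaction, signals that disagreement stems more from how individual raters interpret particular items than from the performances themselves.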