Abstract

Objectives: Communication and other clinical skills are assessed in medical schools with Objective Structured Clinical Examinations (OSCEs) so routinely that it can be difficult to monitor and maintain validity. We report on the accumulation of validity evidence for the Clinical Communication Skills Assessment Tool (CCSAT) based on its use with nine cohorts of medical students in a high-stakes OSCE.

Methods: We describe the implementation of the CCSAT, including the underlying model, the tool's items, domains, scales, and scoring, and its role in the curriculum. Internal structure is explored through item, internal consistency, and confirmatory factor analyses. Evidence for CCSAT validity is synthesized within the prevailing frameworks (Messick [12], Kane [13]) based on continuous quality improvement and use of the CCSAT for feedback, remediation, curricular design, and research.

Results: Implementation of the CCSAT over time has supported our communication skills curriculum and training. Thoughtful case development and investment in standardized patient training have contributed to data quality. Item analysis supports our behaviorally anchored scale (not done, partly done, well done), and the skill domains suggested by an a priori, evidence-based clinical communication model were confirmed in analyses of actual student data. Evidence synthesized across the frameworks indicates consistent validity of CCSAT scores for generalization inferences (that the tool captures the construct), responsiveness (sensitivity to change and difference), content validity and internal structure, relationships to other variables, and consequences/implications. More evidence is needed to strengthen the validity of CCSAT scores for extrapolation inferences and real-world implications.

Conclusions and Practice Implications: This pragmatic approach to evaluating validity within a program of assessment serves as a model for medical schools seeking to continuously monitor the quality of clinical skills assessments, a need made particularly relevant now that the US NBME no longer administers the Step 2 Clinical Skills exam, leaving individual schools responsible for ensuring that graduates have acquired the requisite core clinical skills. We document strong evidence for CCSAT validity over time and across cohorts, as well as areas for improvement and further examination.
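For readers unfamiliar with the internal-structure analyses named in the Methods, the sketch below illustrates how two standard item statistics, corrected item-total correlations and Cronbach's alpha, can be computed for behaviorally anchored items scored 0 (not done), 1 (partly done), or 2 (well done). This is a minimal illustration only, not the authors' analysis code: the data, item count, and column names are hypothetical.

```python
# Illustrative sketch (not from the paper): item statistics for
# behaviorally anchored items scored 0 = not done, 1 = partly done,
# 2 = well done. All data and column names here are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical ratings: rows = students, columns = checklist items.
items = pd.DataFrame(
    rng.integers(0, 3, size=(200, 6)),
    columns=[f"item_{i + 1}" for i in range(6)],
)

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = df.shape[1]
    item_vars = df.var(axis=0, ddof=1)
    total_var = df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Corrected item-total correlation: each item against the sum of the
# remaining items, so an item is not correlated with itself.
total = items.sum(axis=1)
item_total_r = {col: items[col].corr(total - items[col]) for col in items.columns}

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
for col, r in item_total_r.items():
    print(f"{col}: corrected item-total r = {r:.2f}")
```

In practice these statistics would be computed per OSCE case and per skill domain rather than over a pooled item set, and the confirmatory factor analysis reported in the paper would additionally require a dedicated structural equation modeling package. Note that the randomly generated data above will yield a near-zero alpha; real rating data with coherent domains would not.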
