Abstract

The objective structured clinical examination (OSCE) is one of the most valid, reliable, and effective tools for assessing clinical and communication skills, and it often relies on standardized patients (SPs). SPs can also serve as assessors of those skills. A crucial concern in SP-based assessment is ensuring the quality and consistency of the SPs' portrayal of the case and their ability to complete checklists adequately. The aim of this study was to assess the validity and reliability of SPs' assessments of students' communication skills using a Calgary-Cambridge checklist. This cross-sectional, correlational study was conducted at Tehran University of Medical Sciences. We first examined validity: the criterion validity of the SPs' completed checklists was assessed by determining the correlation between the SPs' checklists and checklists filled in individually by three physician raters. We then examined reproducibility, which was assessed through a test-retest approach and inter-rater reliability. The mean correlation used to assess the validity of the checklists completed by individual SPs was 0.81. Inter-rater reliability was calculated with the kappa coefficient, and the overall agreement among the three raters was 0.85. The test-retest analysis showed no significant differences between the test and retest results. Given the growing number of medical students and faculty members' competing educational, research, and health-service responsibilities, assessing medical students' communication skills is a complex task. The results of our study showed that trained SPs can serve as a valid means of assessing medical students' communication skills, one that is also more cost-effective and reduces the workload of medical faculty.
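
As a rough illustration of the agreement statistics named above, the following Python sketch shows how checklist agreement could be computed with a Pearson correlation and Cohen's kappa. The data and variable names are hypothetical and only two raters are compared; this is not the study's actual analysis code.

# Illustrative sketch only: computing agreement between an SP's checklist
# and one physician rater's checklist for a single student encounter.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical item-level checklist scores (0 = not done, 1 = done).
sp_scores = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1])
physician_scores = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1])

# Criterion validity: correlation between SP and physician checklist scores.
r, p_value = pearsonr(sp_scores, physician_scores)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")

# Inter-rater reliability: Cohen's kappa between the two raters.
kappa = cohen_kappa_score(sp_scores, physician_scores)
print(f"Cohen's kappa = {kappa:.2f}")

Note that Cohen's kappa compares two raters at a time; with three physician raters, a multi-rater statistic such as Fleiss' kappa, or averaged pairwise kappas, would typically be reported.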
