Abstract

Background: Objective Structured Clinical Examinations (OSCEs) are a simulation-based assessment tool used extensively in medical education to evaluate clinical competence. OSCEs are widely regarded as more valid, reliable, and valuable than traditional assessment measures, and are now emerging within professional psychology training programs. While findings on the quality of OSCEs are lacking in the published psychology literature, psychometric properties can be inferred by investigating implementation. Accordingly, the current review assessed the implementation of OSCEs within psychology programs against a set of Quality Assurance Guidelines (QAGs).

Methods: A systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations. Electronic databases, including ProQuest Psychology, PsycArticles, Psychology and Behavioural Sciences Collection, and PsycInfo, and key indexing databases such as Scopus, ProQuest, and Web of Science were searched to identify relevant articles. Twelve full-text articles met all inclusion criteria and were included in the review.

Results: There was considerable heterogeneity in the quality of the studies and in the reporting of OSCE data. Assessment of OSCE implementation against the QAGs revealed overall adherence to be "Fair."

Conclusion: The current review consolidated what is known about the psychometric quality of OSCEs within psychology programs. It highlights the need for further quantitative evidence on the psychometric soundness of OSCEs in psychology training. Furthermore, it is recommended that future training programs implement and report OSCEs in accordance with standardized guidelines.
