Abstract

This study had two purposes: determining the reliability and validity of the Objective Structured Clinical Examination (OSCE) in assessing performance by trainees at all levels, including medical students and chief residents; and estimating the impact of providing OSCE participants with immediate feedback about their performance. A comprehensive 210-min OSCE was administered to 53 surgical residents and 6 junior medical students. Faculty experts proctored all patient stations and provided immediate feedback to participants after the patient interaction segments (Part A). The participants then answered questions about the patients seen (Part B). The reliability of the OSCE was high (0.91), identical to that of a previous resident OSCE with no feedback. The standard error of measurement for both parts was approximately 4%. At the 95% confidence level, each participant's actual level of clinical performance (Part A) and clinical knowledge (Part B) could be estimated with an error of ±8%. Participants showed significant differences in clinical performance (Part A, P < 0.01) and knowledge (Part B, P < 0.01) by level of training. Most participants (74%) rated the OSCE as an above average or outstanding educational method. The OSCE is a valid and reliable test of residents' clinical skills. Feedback to participants during the OSCE was positively received and did not perturb test reliability.
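A minimal sketch of the interval arithmetic, assuming the conventional normal-approximation confidence interval implied by the reported figures (the authors report only the SEM and the ±8% bound):

    95% CI half-width ≈ 1.96 × SEM ≈ 1.96 × 4% ≈ 8%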
