Abstract

Objectives
To develop an Objective Structured Clinical Examination (OSCE) station that assesses medical students' skills in evaluating evidence and applying appropriate treatment options in critical situations with a simulated patient, and to assess the results by comparing the discrimination and reliability of standardized-patient and simulated-patient stations.

Materials and Methods
OSCE performance scores of 58 seventh-year medical students at the University of Tzu-Chi School of Medicine, collected from April 10 to April 11, 2011, were analyzed using descriptive statistics and item discrimination. Thirteen OSCE cases were identified for evaluation, and the results of all stations were compared with those of the station presenting the critical clinical scenario.

Results
Discrimination statistics indicated that only the critical-scenario station, prepared with a high-fidelity simulator, was effective in distinguishing between high-scoring and low-scoring medical students.

Conclusion
Failing to design an effective skill assessment tool is a missed opportunity to understand more fully, and to apply, the results of medical students' clinical performance.

