Abstract

Background: In-training examinations (ITEs) have been widely adopted as assessment tools to measure residents' competency. We incorporated different assessment formats into an emergency medicine (EM) residency training program to form a multimodal, multistation ITE. This study examined the cost and effectiveness of its different testing formats.

Methods: We conducted a longitudinal study in a tertiary teaching hospital in Taiwan. Nine EM residents were enrolled and followed for 4 years, and their biannual ITE scores were recorded and analyzed. Each ITE consisted of 8–10 stations in four formats: multiple-choice question (MCQ), question and answer (QA), oral examination (OE), and high-fidelity simulation (HFS). Learner satisfaction, validity, reliability, and costs were analyzed.

Results: A total of 486 station scores were recorded over the 4 years. The numbers of MCQ, OE, QA, and HFS stations were 45 (9.26%), 90 (18.5%), 198 (40.7%), and 135 (27.8%), respectively. The overall Cronbach's alpha reached 0.968, indicating good overall internal consistency. The correlation with the EM board examination was highest for HFS (ρ = 0.657). The average costs of an MCQ station, an OE station, and an HFS station were roughly 3, 14, and 21 times that of a QA station, respectively.

Conclusions: Multidimensional assessment contributes to good reliability. HFS correlates best with the final training examination score but is also the most expensive format among the ITE stations. Increasing the number of testing domains with various formats improves the ITE's overall reliability. Program directors must understand each test format's strengths and limitations to assemble the best combination of examinations for their local context.
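The reliability figure reported above (Cronbach's alpha = 0.968) is computed from a residents-by-stations score matrix. The sketch below shows the standard Cronbach's alpha formula in Python with NumPy; the score matrix is entirely illustrative and is not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    k = scores.shape[1]                          # number of items (stations)
    item_var = scores.var(axis=0, ddof=1)        # sample variance per station
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of each examinee's total
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Illustrative data only: 5 hypothetical residents x 4 stations
scores = np.array([
    [80, 75, 82, 78],
    [60, 58, 65, 62],
    [90, 88, 85, 91],
    [70, 72, 68, 71],
    [55, 60, 57, 59],
], dtype=float)

print(round(cronbach_alpha(scores), 3))
```

Because all four hypothetical stations rank the residents almost identically, this toy matrix yields a high alpha, mirroring how consistent station scores across formats produce the strong internal consistency reported in the study.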

Highlights

  • In-training examination (ITE) has been widely adopted as an assessment tool to measure residents’ competency

  • Our study examined the implementation of a multiformat ITE in an emergency medicine (EM) residency training program and demonstrated its validity and reliability


Introduction

Various medical specialties have adopted in-training examinations (ITEs) as a powerful and multifunctional assessment tool to measure residents' competency [3–5]. Simulations have been used in medical education since the 1960s [9]. They have been integrated into curricula emphasizing core competencies and communication skills for emergency medicine (EM) residents [10, 11]. High-fidelity simulation (HFS), which uses computer-controlled manikins, has been demonstrated to be realistic and effective in medical education [19, 20]. However, the high cost of HFS remains a major obstacle to its implementation [24, 25].

