Abstract
Purpose
Mock oral examinations (MOEs) prepare general surgery residents for the American Board of Surgery Certifying Exam by assessing their medical knowledge and clinical judgement. There is no standard accepted process for quality analysis of MOE content items. Effective questions should correlate with mastery of MOE content as well as exam passage. Our aim was to identify opportunities for question improvement via item analysis of a standardized MOE.

Methods
We performed a retrospective review of testing data from the 2022 Southern California Virtual MOE, which examined 64 general surgery residents from six training programs. Each resident was assessed with 73 exam questions distributed across 12 standardized cases. Study authors indexed questions by clinical topic (e.g., breast, trauma) and competency category (e.g., professionalism, operative approach). We defined MOE passage as a mean percentage correct and mean room score within 1 standard deviation of the mean or higher. Questions were assessed for difficulty, discrimination between postgraduate year (PGY) levels, and correlation with MOE passage.

Results
The overall passage rate was 77% (49/64 residents), with no differences between PGY levels. PGY3 residents answered fewer questions correctly than PGY4 residents (72% vs 78%, p < 0.001) and PGY5 residents (72% vs 82%, p < 0.001). Of 73 total questions, 17 (23.2%) significantly correlated with MOE passage or failure. By competency category, these were predominantly related to patient care (52.9%) and operative approach (23.5%), with fewer related to diagnosis (11.8%), professional behavior (5.9%), and decision to operate (5.9%). By clinical topic, these were equally distributed among trauma (17.7%), large intestine (17.7%), endocrine (17.7%), and surgical critical care (17.7%), with fewer in breast (11.8%), stomach (11.8%), and pediatric surgery (5.9%).
We identified two types of ineffective questions: (1) questions answered correctly by 100% of test-takers, with no discriminatory ability (n = 3); and (2) questions that varied inversely with exam passage (n = 11). In total, 19% (14/73) of exam questions were deemed ineffective.

Conclusions
Item analysis of a multi-institutional mock oral exam found that 23% of questions correlated with exam passage or failure, effectively discriminating which examinees had mastered MOE content. We also identified 19% of questions as ineffective; these can be targeted for improvement.
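The item-analysis logic described above can be sketched in code. The following is a minimal illustration, not the authors' actual analysis pipeline: it assumes item responses and exam passage are coded as 1/0, measures difficulty as proportion correct, measures discrimination as the point-biserial correlation between item correctness and passage, and flags the two ineffective patterns the study reports (items everyone answers correctly, and items that vary inversely with passage). All function names and the classification thresholds are illustrative assumptions.

```python
from math import sqrt

def item_difficulty(responses):
    """Proportion of examinees answering the item correctly (1 = correct)."""
    return sum(responses) / len(responses)

def point_biserial(responses, passed):
    """Correlation between item correctness (0/1) and exam passage (0/1).

    Positive values mean examinees who passed the exam tended to answer
    this item correctly; negative values mean the reverse.
    """
    n = len(responses)
    mean_r = sum(responses) / n
    mean_p = sum(passed) / n
    cov = sum((r - mean_r) * (p - mean_p)
              for r, p in zip(responses, passed)) / n
    sd_r = sqrt(sum((r - mean_r) ** 2 for r in responses) / n)
    sd_p = sqrt(sum((p - mean_p) ** 2 for p in passed) / n)
    if sd_r == 0 or sd_p == 0:
        return 0.0  # no variance -> no discriminatory ability
    return cov / (sd_r * sd_p)

def classify_item(responses, passed):
    """Flag the two ineffective-question patterns described in the abstract."""
    if item_difficulty(responses) == 1.0:
        return "ineffective: answered correctly by all examinees"
    if point_biserial(responses, passed) < 0:
        return "ineffective: varies inversely with passage"
    return "effective"
```

For example, an item answered correctly by every examinee carries no information about who passed, and an item that examinees who failed answered correctly more often than those who passed would be flagged as varying inversely with passage.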
Published in Global Surgical Education - Journal of the Association for Surgical Education.