Introduction

Ensuring that medical students are challenged with critical thinking problems and can solve them successfully is important to their development as future physicians. Multiple-choice questions (MCQs), if written correctly, can promote the development of these critical thinking skills. Bloom's taxonomy is a framework used to classify cognitive skills into increasingly complex levels of learning (Remember, Understand, Apply, Analyze, Evaluate, and Create). Adapted versions of Bloom's taxonomy have been used to categorize MCQs into lower-order, non-critical thinking questions and higher-order, critical thinking questions.

Purpose and Hypothesis

The purpose of this study was to evaluate the distribution of Gross Anatomy and Development MCQs categorized as either non-critical thinking questions (Remember or Understand) or critical thinking questions (Apply/Analyze) and to analyze student performance on those questions. We hypothesized that students would perform better on non-critical thinking MCQs than on critical thinking MCQs.

Methods

At the Medical College of Georgia, the first-year medical curriculum consists of systems-based modular blocks composed of basic science components, including Gross Anatomy and Development. Academic performance is measured primarily by MCQ exams for each module. Gross Anatomy and Development MCQs (n=260) from the 2016–2017 academic year were evaluated and sorted into Remember, Understand, or Apply/Analyze categories by four evaluators. Interrater reliability was calculated using Krippendorff's α (α=0.54). Student academic performance (n=192 students) on the categorized MCQs was analyzed and compared using a one-way ANOVA with Tukey's post-hoc test (an illustrative sketch of this analysis pipeline follows the Conclusion).

Results

Of the 151 Anatomy questions, 20.5% were categorized as Remember, 51% as Understand, and 28.5% as Apply/Analyze. Students scored similarly on all three categories of Anatomy MCQs (Remember: 77.3% ± 18%; Understand: 82.3% ± 13%; Apply/Analyze: 78.7% ± 16%; p=0.202). Of the 109 Development questions, 33% were categorized as Remember, 39.4% as Understand, and 27.5% as Apply/Analyze. Students performed significantly better on Development MCQs categorized as Understand than on those categorized as Remember (81.1% ± 11.5% vs. 72.4% ± 15.5%; p=0.015). There was no difference between Understand and Apply/Analyze MCQs (Apply/Analyze: 74.6% ± 13.2%; p=0.109) or between Remember and Apply/Analyze MCQs (p=0.799).

Conclusion

Students performed similarly across all categories of Anatomy MCQs. For Development, however, students performed significantly better on Understand MCQs than on Remember MCQs. Although Remember questions are single-step and straightforward, they may be difficult when they test specific, isolated facts rather than the broader concepts assessed by Understand questions. Apply/Analyze questions involve multiple steps or require students to connect several pieces of information to arrive at the correct answer, which may make them more difficult. Overall, fewer MCQs were categorized as critical thinking than expected, as it was initially thought that the majority of MCQs in the curriculum would involve higher-order reasoning skills. However, classification of MCQs by evaluators was challenging because students may approach questions differently based on their prior knowledge and the study resources they use. This study identified opportunities to improve assessments by incorporating more MCQs that test higher-order critical thinking skills.
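The abstract does not name the software used for its statistics. The following Python sketch, run on fabricated data, is a minimal illustration of the Methods pipeline: Krippendorff's α over the four evaluators' category assignments, followed by a one-way ANOVA with Tukey's post-hoc test across the three question categories. The krippendorff, scipy, and statsmodels packages are assumptions chosen for illustration, not the authors' tools.

```python
# Minimal sketch of the analysis pipeline on fabricated data; the package
# choices are assumptions, since the abstract does not specify its software.
import numpy as np
import krippendorff
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

# --- Interrater reliability ---------------------------------------------
# Rows = 4 evaluators, columns = 260 MCQs; values are the Bloom category
# each evaluator assigned (0 = Remember, 1 = Understand, 2 = Apply/Analyze).
ratings = rng.integers(0, 3, size=(4, 260))  # fabricated ratings
alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.2f}")  # the abstract reports 0.54

# --- Student performance by question category ---------------------------
# One simulated per-student mean score (%) for each category (n=192).
remember   = rng.normal(72.4, 15.5, 192)
understand = rng.normal(81.1, 11.5, 192)
apply_anlz = rng.normal(74.6, 13.2, 192)

# One-way ANOVA across the three categories.
f_stat, p_val = stats.f_oneway(remember, understand, apply_anlz)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey's HSD post-hoc test for the pairwise category comparisons.
scores = np.concatenate([remember, understand, apply_anlz])
groups = (["Remember"] * 192 + ["Understand"] * 192
          + ["Apply/Analyze"] * 192)
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))
```

With the study's actual ratings and scores in place of the simulated arrays, the same three calls would reproduce the reported comparisons; the fabricated numbers above will not match the abstract's values.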
This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.