Abstract

Background
The need to develop critical thinking skills in future medical practitioners is widely appreciated among medical educators, as clinical decision-making represents a distinctive, high-level application of critical reasoning. Recent efforts have therefore focused on ways to incorporate critical thinking skill-building into medical curricula. However, the availability of effective and easily translatable methods for assessing content-specific critical thinking skills, particularly in preclinical curricula, has lagged behind; observational assessments such as structured observations of performance are limited by the substantial faculty time and effort they require. The objective of this study was to develop and evaluate written assessments (in particular, multiple choice questions) designed to assess students' critical thinking in the context of an organ systems-based preclinical course. Secondarily, we sought to characterize in-house questions with regard to their correlation with student performance on a norm-referenced assessment of critical thinking.

Methods
Students (n=183) enrolled in a second-year medical school course were recruited for participation; 19 consented. Student performance on three locally developed multiple choice assessments (CTAs; 31 items in total) was evaluated. In addition, participants completed pre- and post-course administrations of the California Critical Thinking Skills Test for healthcare students (HSRT-N). Results of the multiple choice assessments, the HSRT-N, and overall student performance in the course were analyzed.

Results
Of the eight critical thinking skill components assessed by the HSRT-N, student performance on the CTAs correlated strongly only with the analysis component score (r=0.601). CTA questions specifically designed to test critical thinking (n=13) correlated weakly with overall HSRT-N score (r=0.298), while CTA questions not designed to assess critical thinking (n=18) did not correlate with HSRT-N score (r=0.097). Past course performance and overall CTA score together predicted global performance in the course (p<0.001, R^2=0.908), whereas HSRT-N score did not.

Conclusion
While critical reasoning can be appraised through observational assessments, evaluating it with written selected-response (multiple choice) exams has proven difficult for medical educators; developing such exams locally to assess specific content and skills is a particular challenge. Here we demonstrate early success in developing multiple choice questions aimed at assessing critical thinking, with correlation to a norm-referenced assessment. Future efforts may focus on outlining a generalizable method of question development that can predictably assess reasoning related to medical and biomedical content.

This abstract is from the Experimental Biology 2018 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
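For readers who want to see the shape of the analyses reported above, the following is a minimal Python sketch of that kind of workflow: Pearson correlations between assessment scores and a multiple linear regression predicting course performance. The data, variable names, and generated relationships here are placeholders for illustration only; the study's actual dataset and analysis pipeline are not published with this abstract.

```python
# Illustrative sketch only: toy data stand in for the study's per-student scores.
import numpy as np
from scipy.stats import pearsonr
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 19  # number of consenting participants in the abstract

# Placeholder per-student scores; the built-in relationships exist purely so
# the example prints sensible statistics.
cta_total = rng.uniform(15, 31, n)                       # total CTA score (31 items)
hsrtn_analysis = 0.5 * cta_total + rng.normal(0, 2, n)   # HSRT-N analysis subscale (toy)
past_performance = rng.uniform(70, 95, n)                # prior course performance (toy, %)
course_grade = (0.6 * past_performance + 1.2 * cta_total
                + rng.normal(0, 2, n))                   # global course performance (toy)

# Pearson correlation, e.g., CTA total vs. HSRT-N analysis component.
r, p = pearsonr(cta_total, hsrtn_analysis)
print(f"CTA vs. HSRT-N analysis: r={r:.3f}, p={p:.4g}")

# Multiple linear regression: course grade ~ past performance + CTA total.
X = sm.add_constant(np.column_stack([past_performance, cta_total]))
model = sm.OLS(course_grade, X).fit()
print(f"R^2={model.rsquared:.3f}, overall F-test p={model.f_pvalue:.4g}")
```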


