Abstract

The Cornell Critical Thinking Test (CCTT) is one of several multiple‐choice tests with validated questions reported to measure general critical thinking (CT) ability. One of the IFT Education Standards for undergraduate degrees in Food Science is the emphasis on the development of critical thinking. While this skill is easy to list as a student‐learning objective, measuring gains in CT is relatively difficult. If the majority of class time is spent discussing and solving ill‐defined problems, will students become actively and meaningfully involved in their own learning, and will there be measurable gains in CT skills? To measure gains using this format, the CCTT was administered as a pre‐ and posttest to Food Science and Human Nutrition students in an Experimental Foods class taught every fall over an 8 y period (2001–2008). Statistical analysis indicated that in 2 of the years (2002 and 2004), there were significant gains in CT scores (P = 0.036 and 0.045, respectively). Furthermore, in both years, there were significant gains in the same 2 aspects of CT (deduction and assumption) and not in the other aspects. However, we suggest that completing several take‐home exams with many open‐ended questions, writing detailed laboratory reports, and unsolicited student reflections in journal entries commenting on apparent gains in CT skills may together be a better indication of actual gains in CT than the CCTT scores themselves.
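The abstract does not specify which statistical test yielded the reported P values. A minimal sketch of one plausible analysis, assuming a paired t-test on matched pre/post CCTT scores (the scores below are illustrative placeholders, not the study's data):

```python
# Hypothetical sketch: the paper does not name the test used; a paired
# (dependent-samples) t-test is one plausible choice for pre/post scores.
from scipy import stats

# Placeholder scores for illustration only (not the study's data):
# one pre and one post CCTT score per student, matched by position.
pre  = [38, 41, 35, 44, 40, 37, 42, 39, 36, 43]
post = [40, 44, 37, 45, 43, 38, 44, 42, 39, 44]

# Tests whether the mean within-student pre-to-post difference is zero.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.3f}, P = {p_value:.3f}")  # gain significant if P < 0.05
```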
