Abstract

Studies of international mathematics achievement such as the Trends in International Mathematics and Science Study (TIMSS) have employed classical test theory and item response theory to rank individuals along a latent ability continuum. Although these approaches have provided insights into comparisons between countries, they have yet to examine how mastery of specific attributes affects student performance or how such information can inform curricular instruction. In the 2007 administration of TIMSS, two benchmark participants, Massachusetts and Minnesota, were tested following the same procedural methods, providing an opportunity for comparison within and across the United States. An overall comparison showed Massachusetts and Minnesota significantly outperforming the United States as a whole. However, this article shows that far richer fine-grained information, translatable directly to classroom application at the attribute level, becomes available when a cognitive diagnostic model (CDM) such as the deterministic inputs, noisy "and" gate (DINA; Junker & Sijtsma, 2001) model is used. Results showed a significant disparity between the proportion of students answering an item correctly and the proportion mastering the skills required to solve it. Advantages of CDMs are discussed, along with a CDM-based method for filtering distractor response categories that can help instructors diagnose a student's attribute mastery.
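To make the abstract's distinction concrete, the DINA model referenced above predicts a correct response only when an examinee has mastered every attribute an item requires, subject to slip and guess probabilities. The sketch below is a minimal illustration of that item response function; the variable names (`alpha`, `q_row`, `slip`, `guess`) and the parameter values are illustrative assumptions, not estimates from the article.

```python
def dina_prob(alpha, q_row, slip, guess):
    """P(correct response) for one examinee on one item under the DINA model.

    alpha : list[int] -- examinee's attribute-mastery vector (1 = mastered)
    q_row : list[int] -- item's row of the Q-matrix (1 = attribute required)
    slip  : float     -- P(incorrect | all required attributes mastered)
    guess : float     -- P(correct | at least one required attribute unmastered)
    """
    # eta = 1 iff the examinee has mastered every attribute the item requires
    eta = int(all(a >= q for a, q in zip(alpha, q_row)))
    # Masters answer correctly unless they slip; non-masters only by guessing
    return (1 - slip) if eta else guess

# Examinee mastering attributes 1 and 2 (not 3), item requiring attributes 1 and 2:
p_master = dina_prob([1, 1, 0], [1, 1, 0], slip=0.10, guess=0.20)   # -> 0.9
# Same item, examinee lacking attribute 2:
p_nonmaster = dina_prob([1, 0, 0], [1, 1, 0], slip=0.10, guess=0.20)  # -> 0.2
```

This conjunctive structure is what lets the model separate "answered correctly" from "mastered the required skills": a nonzero guess parameter means the observed proportion correct can exceed the proportion of true masters, the disparity the article reports.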
