Abstract

Traditionally, teachers evaluate students’ abilities via their total test scores. Recently, cognitive diagnostic models (CDMs) have begun to provide information about the presence or absence of students’ skills or misconceptions. Nevertheless, CDMs are typically applied to tests with multiple-choice (MC) items, which provide less diagnostic information than constructed-response (CR) items. This paper introduces new CDMs for tests containing both MC and CR items, and illustrates how to use them to analyse MC and CR data and thus identify students’ skills and misconceptions in a mathematics domain. To demonstrate the application of the new models, analyses were conducted on real data: the responses of 497 sixth-grade students, randomly selected from four Taiwanese primary schools, to eight direct proportion items. The results show that the new models can better determine students’ skills and misconceptions, in that they achieve higher inter-rater agreement rates than traditional CDMs.
