Abstract

Current methods for fitting cognitive diagnosis models (CDMs) to educational data typically rely on expectation-maximization (EM) or Markov chain Monte Carlo (MCMC) to estimate the item parameters and examinees' proficiency-class memberships. However, for advanced, more complex CDMs such as the reduced reparameterized unified model (Reduced RUM) and the (saturated) loglinear cognitive diagnosis model (LCDM), EM and MCMC have a reputation for consuming excessive CPU time. Joint maximum likelihood estimation (JMLE) is proposed as an alternative to EM and MCMC. Maximization of the joint likelihood is typically accomplished in a few iterations, drastically reducing the CPU time usually needed to fit advanced CDMs such as the Reduced RUM or the (saturated) LCDM. As another attractive feature, the JMLE algorithm presented here resolves the traditional shortcoming of JMLE estimators, their lack of statistical consistency, by using an external, statistically consistent estimator to obtain initial estimates of examinees' class memberships as starting values. It can be proven that under this condition the JMLE item parameter estimators are also statistically consistent. The computational performance of the proposed JMLE algorithm is evaluated in two comprehensive simulation studies.
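To make the alternating scheme the abstract describes concrete, here is a minimal sketch of joint maximum likelihood estimation for a simple DINA model (a basic CDM, used here for illustration; the paper targets richer models such as the Reduced RUM and LCDM). All sizes, the Q-matrix, and the Hamming-distance starting classifier are assumptions for the toy example, not details taken from the paper; the classifier merely stands in for the external, statistically consistent estimator of class memberships mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (sizes and Q-matrix are illustrative, not from the paper).
N, J, K = 500, 20, 3                 # examinees, items, attributes
Q = rng.integers(0, 2, size=(J, K))
Q[:K] = np.eye(K, dtype=int)         # complete Q-matrix: each attribute has a pure item
Q[Q.sum(1) == 0, 0] = 1              # every item measures at least one attribute
true_g = rng.uniform(0.05, 0.20, J)  # guessing parameters
true_s = rng.uniform(0.05, 0.20, J)  # slipping parameters

patterns = np.array([[(c >> k) & 1 for k in range(K)] for c in range(2 ** K)])
true_idx = rng.integers(0, 2 ** K, N)          # true proficiency-class labels

eta_pat = (patterns @ Q.T == Q.sum(1)).astype(int)   # ideal responses, (2^K, J)
p = np.where(eta_pat[true_idx] == 1, 1 - true_s, true_g)
X = (rng.random((N, J)) < p).astype(int)             # observed item responses

# Starting values: Hamming-distance classification of each examinee,
# a stand-in for the external consistent estimator the abstract refers to.
idx = np.abs(X[:, None, :] - eta_pat[None]).sum(-1).argmin(1)

for _ in range(50):
    e = eta_pat[idx]                 # ideal responses under current memberships
    # Item step: closed-form DINA parameter updates given class memberships.
    g_hat = np.clip((X * (1 - e)).sum(0) / np.maximum((1 - e).sum(0), 1), 1e-3, 0.499)
    s_hat = np.clip(((1 - X) * e).sum(0) / np.maximum(e.sum(0), 1), 1e-3, 0.499)
    # Person step: reassign each examinee to the maximum-likelihood class.
    lp1 = np.log(np.where(eta_pat == 1, 1 - s_hat, g_hat))
    lp0 = np.log(np.where(eta_pat == 1, s_hat, 1 - g_hat))
    new_idx = (X @ lp1.T + (1 - X) @ lp0.T).argmax(1)
    if np.array_equal(new_idx, idx): # converged: memberships stable
        break
    idx = new_idx
```

In this toy run the two steps each have closed forms, so the joint likelihood stabilizes within a handful of iterations, which is the computational advantage the abstract claims over EM and MCMC for more complex CDMs.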

