Abstract
Diagnostic classification models (DCMs) are restricted latent class models with a set of cross-class equality constraints and additional monotonicity constraints on their item parameters, both of which are needed to ensure the interpretability of the latent classes and the model parameters. In this paper, we develop an efficient, Gibbs sampling-based Bayesian Markov chain Monte Carlo estimation method for general DCMs with monotonicity constraints. A simulation study evaluating parameter recovery showed that the algorithm estimates model parameters accurately. Moreover, the proposed algorithm was compared to a previously developed Gibbs sampling algorithm that imposed constraints only on the main-effect item parameters of the log-linear cognitive diagnosis model; the newly proposed algorithm showed less bias and faster convergence. The algorithm was also applied to reading assessment data from the 2000 Programme for International Student Assessment.
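To make the constraint structure described above concrete, the following minimal Python sketch (not the authors' estimator) shows the item response function for a single two-attribute item under the log-linear cognitive diagnosis model and the corresponding monotonicity condition, under which mastering an additional attribute can never lower the success probability. The parameter names (`lam0`, `lam1`, `lam2`, `lam12`) and the example values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lcdm_item_prob(lam0, lam1, lam2, lam12, alpha):
    """Correct-response probability for one LCDM item measuring two attributes.
    alpha is a length-2 binary attribute-mastery pattern."""
    a1, a2 = alpha
    logit = lam0 + lam1 * a1 + lam2 * a2 + lam12 * a1 * a2
    return 1.0 / (1.0 + np.exp(-logit))

def satisfies_monotonicity(lam1, lam2, lam12):
    """Monotonicity for a two-attribute LCDM item: both main effects must be
    nonnegative, and the interaction must not be smaller than the negative of
    either main effect, so every added mastered attribute is non-decreasing
    in the success probability."""
    return lam1 >= 0 and lam2 >= 0 and lam12 >= -min(lam1, lam2)

# Hypothetical item parameters: intercept, two main effects, interaction
params = (-1.5, 1.0, 1.2, 0.5)
patterns = [(0, 0), (1, 0), (0, 1), (1, 1)]
probs = {p: lcdm_item_prob(*params, p) for p in patterns}
print(probs)                                # probabilities rise with mastery
print(satisfies_monotonicity(*params[1:]))  # True for these values
```

In a constrained Gibbs sampler of the kind the abstract describes, conditions like `satisfies_monotonicity` define the truncated region from which item parameters are drawn at each iteration; the cross-class equality constraints are reflected in the fact that all classes sharing the same relevant attribute pattern receive the same probability from `lcdm_item_prob`.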