Abstract
Saturated diagnostic classification models (DCMs) can flexibly accommodate various relationships among attributes to diagnose individual attribute mastery, and they include many important DCMs as sub-models. However, existing formulations of the saturated DCM are not well suited to deriving conditionally conjugate priors for the model parameters. Because such priors are the key to developing a variational Bayes (VB) inference algorithm, in the present study we proposed a novel mixture formulation of the saturated DCM. Based on this formulation, we developed a VB inference algorithm for the saturated DCM that enables scalable and computationally efficient Bayesian estimation. A simulation study indicated that the proposed algorithm recovers the parameters well under various conditions. We also demonstrated that the proposed approach is particularly suited to settings in which new data become available sequentially over time, such as computerized diagnostic testing. In addition, a real educational dataset was analyzed with both the proposed VB algorithm and a Markov chain Monte Carlo (MCMC) algorithm. The two methods produced very similar estimates, and the proposed VB inference was much faster than MCMC. The proposed method thus offers a practical solution to the problem of computational load.
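For orientation, the following generic formulation is standard background on mean-field VB, not material from the paper itself. Mean-field VB approximates the posterior p(theta | X) by a factorized distribution q(theta) = prod_j q_j(theta_j) and cycles through coordinate ascent updates of the form below; when the priors are conditionally conjugate, each update stays in a known parametric family with closed-form parameters, which is why deriving conditionally conjugate priors is the prerequisite for the algorithm described above.

\log q_j^{*}(\theta_j) = \mathbb{E}_{q_{-j}}\!\left[\log p(X, \theta)\right] + \mathrm{const},
\qquad
\mathrm{ELBO}(q) = \mathbb{E}_{q}\!\left[\log p(X, \theta)\right] - \mathbb{E}_{q}\!\left[\log q(\theta)\right].

Each sweep of these updates monotonically increases the ELBO, and the closed-form updates available under conjugacy are typically what make VB much faster than MCMC in practice.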