Abstract

Markov chain Monte Carlo (MCMC) methods, such as Gibbs sampling, offer an alternative to marginal maximum likelihood (MML) estimation that holds particular promise for parameter estimation with complex models, in small-sample situations, and in other applications for which MML algorithms have not been established. MCMC circumvents the problems associated with implementing an estimation algorithm for a complex, multidimensional probability distribution by sampling each parameter from its one-dimensional conditional posterior distribution at each stage of the Markov chain. In this article, the authors compared the quality of item parameter estimates obtained with MML and MCMC for one type of complex item response theory model, the nominal response model. The quality of item parameter recovery was nearly identical for MML and MCMC, and both methods tended to produce good estimates even for short tests and relatively small sample sizes. Parameter recovery was best for items of moderate difficulty (i.e., items matched to the latent trait distribution) and worst for items that were extremely easy or extremely difficult. The quality of item parameter recovery improved as test length increased from 10 to 30 items but did not change as test length increased from 20 to 30 items. MCMC estimation takes substantially longer than MML but appears to be a good surrogate for it in situations for which an MML algorithm has not been developed.
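The abstract describes the MCMC approach as drawing each parameter from its one-dimensional conditional posterior at every stage of the chain. As a rough illustrative sketch only, the following Python code implements a Metropolis-within-Gibbs sampler for Bock's nominal response model; the function names, priors, proposal scales, and identification constraint are assumptions made for illustration and are not the authors' actual estimation setup.

```python
import numpy as np


def nrm_probs(theta, a, c):
    """Category probabilities under the nominal response model for one item.

    theta : (N,) latent traits; a, c : (K,) category slopes and intercepts.
    Returns an (N, K) matrix where row i gives P(category k | theta_i).
    """
    z = np.outer(theta, a) + c              # (N, K) category logits
    z -= z.max(axis=1, keepdims=True)       # guard against overflow in exp
    ez = np.exp(z)
    return ez / ez.sum(axis=1, keepdims=True)


def person_loglik(X, theta, a, c):
    """(N,) log-likelihood of each examinee's full response pattern."""
    N, J = X.shape
    ll = np.zeros(N)
    for j in range(J):
        p = nrm_probs(theta, a[j], c[j])
        ll += np.log(p[np.arange(N), X[:, j]])
    return ll


def item_loglik(x_j, theta, a_j, c_j):
    """Scalar log-likelihood of all responses to a single item."""
    p = nrm_probs(theta, a_j, c_j)
    return np.log(p[np.arange(len(x_j)), x_j]).sum()


def gibbs_nrm(X, n_iter=2000, burn=500, prop_sd=0.1, seed=0):
    """Metropolis-within-Gibbs sampler for the nominal response model (sketch).

    X : (N, J) integer responses coded 0..K-1.  Each sweep draws person
    abilities and item category parameters from their conditional posteriors
    via random-walk Metropolis steps, assuming a N(0, 1) prior on theta and
    N(0, 2^2) priors on the slopes and intercepts (illustrative choices).
    The first category of every item is fixed at zero for identification.
    Returns posterior-mean estimates of the slope and intercept matrices.
    """
    rng = np.random.default_rng(seed)
    N, J = X.shape
    K = int(X.max()) + 1
    theta = rng.normal(size=N)
    a = np.zeros((J, K))
    c = np.zeros((J, K))
    a_sum, c_sum = np.zeros_like(a), np.zeros_like(c)

    for it in range(n_iter):
        # 1) Update each theta_i given the current item parameters.
        theta_prop = theta + rng.normal(scale=prop_sd, size=N)
        log_r = (person_loglik(X, theta_prop, a, c) - 0.5 * theta_prop**2
                 - person_loglik(X, theta, a, c) + 0.5 * theta**2)
        accept = np.log(rng.uniform(size=N)) < log_r
        theta = np.where(accept, theta_prop, theta)

        # 2) Update (a_j, c_j) given theta, one item at a time.
        for j in range(J):
            a_prop, c_prop = a[j].copy(), c[j].copy()
            a_prop[1:] += rng.normal(scale=prop_sd, size=K - 1)
            c_prop[1:] += rng.normal(scale=prop_sd, size=K - 1)
            log_r = (item_loglik(X[:, j], theta, a_prop, c_prop)
                     - (a_prop**2 + c_prop**2).sum() / 8.0
                     - item_loglik(X[:, j], theta, a[j], c[j])
                     + (a[j]**2 + c[j]**2).sum() / 8.0)
            if np.log(rng.uniform()) < log_r:
                a[j], c[j] = a_prop, c_prop

        if it >= burn:                       # accumulate post-burn-in draws
            a_sum += a
            c_sum += c

    return a_sum / (n_iter - burn), c_sum / (n_iter - burn)
```

Metropolis steps are used here because the one-dimensional conditional posteriors of the nominal response model are not standard distributions; the overall sweep structure, alternating between person and item parameters, is what the abstract refers to as sampling each parameter from its conditional posterior at every stage of the chain.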
