Abstract

Finite mixture models are a widely used method for modelling data that arise from a heterogeneous population. Within the family of mixtures of regression models, mixtures of linear mixed models have been applied in several areas since, besides accounting for the heterogeneity in the population, they also account for the correlation between observations from the same individual. One of the main issues in mixture models is the estimation of the parameters. Maximum likelihood is one of the most widely used estimation methods for mixture models. However, maximizing the log-likelihood function of a mixture model is complex, and in many cases it yields infinite solutions, so the maximum likelihood estimator may not exist, at least globally. For this reason, it is common to resort to iterative methods, in particular the Expectation-Maximization (EM) algorithm. However, slow convergence and the selection of initial values are two of the biggest issues of the EM algorithm, which is why several modified versions of this algorithm have been developed over the years. In this article we compare the performance of the EM, Classification EM (CEM) and Stochastic EM (SEM) algorithms in estimating the parameters of mixtures of linear mixed models. To evaluate their performance, we carry out a simulation study and a real data application. The results show that the CEM algorithm is the least computationally demanding of the three, although all three algorithms provide similar maximum likelihood estimates of the parameters.
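The three algorithms compared in the article differ only in how the E-step output is used: EM keeps soft posterior responsibilities, CEM replaces them with a hard classification (C-step), and SEM samples a classification from them (S-step). The following sketch illustrates this difference on a deliberately simplified model, a univariate two-component Gaussian mixture rather than a mixture of linear mixed models; all function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def em_step(x, w, mu, sigma, variant="EM", rng=None):
    """One iteration for a univariate Gaussian mixture.

    variant: "EM"  -> soft responsibilities (standard E-step),
             "CEM" -> hard argmax classification (C-step),
             "SEM" -> stochastic assignment sampled from the
                      responsibilities (S-step).
    """
    rng = rng or np.random.default_rng(0)
    # E-step: responsibilities r[i, k]; the 1/sqrt(2*pi) constant
    # cancels in the normalization, so it is omitted.
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    r = dens / dens.sum(axis=1, keepdims=True)
    if variant == "CEM":                       # C-step
        r = np.eye(len(w))[r.argmax(axis=1)]
    elif variant == "SEM":                     # S-step
        z = np.array([rng.choice(len(w), p=ri) for ri in r])
        r = np.eye(len(w))[z]
    # M-step: weighted updates of weights, means and std devs
    nk = r.sum(axis=0)
    w_new = nk / len(x)
    mu_new = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk
    return w_new, mu_new, np.sqrt(var)

# Toy data: two well-separated components (sizes 200 and 300)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 300)])
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
for _ in range(50):
    w, mu, sigma = em_step(x, w, mu, sigma, variant="EM")
```

With well-separated components, the three variants typically converge to very similar estimates, consistent with the article's finding; the hard assignments of CEM make each iteration cheaper, at the cost of biasing the estimates when components overlap.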
