Abstract
Gaussian mixture models (GMM) remain popular in pattern classification applications due to their well-understood Bayesian framework and the availability of good training algorithms such as the expectation maximization (EM) algorithm. EM, however, is a non-discriminative training algorithm, and the performance of a GMM trained with EM can often fall short of discriminative pattern classification techniques such as support vector machines (SVM) and artificial neural network (ANN) architectures such as deep networks and extreme learning machines (ELM). In this paper, a discriminative training method based on the Moore-Penrose pseudo-inverse, often used in the ELM, is applied to a GMM classifier first trained with the EM algorithm. It is shown on a number of benchmark pattern classification problems that the proposed method significantly improves the accuracy of the GMM classifier and produces results comparable to those of the SVM or ELM. The advantages of the proposed method are that there are no tunable parameters and the training is straightforward and fast.
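The following is a minimal sketch of the kind of two-stage procedure the abstract describes, under the assumption that the GMM's component responsibilities are treated as ELM-style hidden-layer features and a linear output layer is solved in closed form with the Moore-Penrose pseudo-inverse. The spherical-covariance EM routine, the toy data, and all variable names here are illustrative, not the paper's actual implementation.

```python
import numpy as np

def fit_gmm_em(X, k, iters=50, seed=0):
    """Fit a spherical-covariance GMM with EM (simplified illustrative sketch)."""
    rng = np.random.RandomState(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]   # initialize means from data points
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibilities under spherical Gaussians
        R = responsibilities(X, pi, mu, var)
        # M-step: re-estimate weights, means, variances from responsibilities
        Nk = R.sum(0) + 1e-9
        pi = Nk / n
        mu = (R.T @ X) / Nk[:, None]
        d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (R * d2).sum(0) / (d * Nk) + 1e-6
    return pi, mu, var

def responsibilities(X, pi, mu, var):
    """Per-sample posterior probability of each mixture component."""
    d = X.shape[1]
    d2 = ((X[:, None, :] - mu[None]) ** 2).sum(-1)          # squared distances (n, k)
    logp = -0.5 * (d2 / var + d * np.log(2 * np.pi * var)) + np.log(pi)
    logp -= logp.max(1, keepdims=True)                       # stabilize before exp
    R = np.exp(logp)
    return R / R.sum(1, keepdims=True)

# Toy two-class data: two well-separated Gaussian clouds
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2) + [-3, 0], rng.randn(100, 2) + [3, 0]])
y = np.r_[np.zeros(100, dtype=int), np.ones(100, dtype=int)]

# Stage 1: non-discriminative EM training of the GMM
pi, mu, var = fit_gmm_em(X, k=4)

# Stage 2: ELM-style discriminative readout — responsibilities as hidden
# features, output weights solved with the Moore-Penrose pseudo-inverse
H = responsibilities(X, pi, mu, var)   # (n_samples, n_components)
T = np.eye(2)[y]                       # one-hot class targets
W = np.linalg.pinv(H) @ T              # closed-form output weights

pred = np.argmax(H @ W, axis=1)
acc = (pred == y).mean()
```

Because the output layer is linear and solved in one pseudo-inverse step, this stage has no tunable parameters and no iterative optimization, consistent with the abstract's claim of straightforward, fast training.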