Abstract

Gaussian mixture models (GMM) remain popular in pattern classification applications due to their well-understood Bayesian framework and the availability of good training algorithms such as the expectation maximization (EM) algorithm. EM is, however, a non-discriminative training algorithm, and the performance of a GMM trained with EM can often fall short of that of discriminative pattern classification techniques such as support vector machines (SVM) and artificial neural network (ANN) architectures, including deep networks and extreme learning machines (ELM). In this paper, a discriminative training method based on the Moore-Penrose pseudo-inverse, as used in the ELM, is applied to a GMM classifier that is first trained with the EM algorithm. It is shown on a number of benchmark pattern classification problems that the proposed method significantly improves the accuracy of the GMM classifier and produces results comparable to those of the SVM and ELM. The advantages of the proposed method are that there are no tunable parameters and that training is straightforward and fast.
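As a rough illustration of the two-stage idea described in the abstract, the sketch below fits a GMM with scikit-learn's EM implementation and then solves an ELM-style linear readout over the component posteriors with the Moore-Penrose pseudo-inverse. The use of a single shared GMM, the component count, and the Iris data set are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch (assumed setup, not the paper's exact implementation):
# 1) fit a GMM with EM, 2) use its component responsibilities as hidden
# features, 3) solve an ELM-style output layer with the pseudo-inverse.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Non-discriminative stage: fit the GMM with EM on the training inputs.
gmm = GaussianMixture(n_components=6, covariance_type="diag", random_state=0)
gmm.fit(X_tr)

# Discriminative stage: treat component responsibilities as hidden features H
# and solve the least-squares output weights W = pinv(H) @ T, as in an ELM.
H_tr = gmm.predict_proba(X_tr)          # (n_samples, n_components)
T = np.eye(y_tr.max() + 1)[y_tr]        # one-hot class targets
W = np.linalg.pinv(H_tr) @ T            # Moore-Penrose pseudo-inverse solve

# Classify by the largest linear output.
H_te = gmm.predict_proba(X_te)
y_pred = np.argmax(H_te @ W, axis=1)
print("test accuracy:", (y_pred == y_te).mean())
```

Note that the pseudo-inverse step has a closed-form solution, which is consistent with the abstract's claim of fast training with no tunable parameters beyond the underlying GMM configuration.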

