Abstract

Gaussian mixture models (GMMs) and multilayer perceptrons (MLPs) are both popular pattern-classification techniques. This brief shows that an MLP with quadratic inputs (MLPQ) can accurately approximate GMMs with diagonal covariance matrices, and it presents the mapping equations between the parameters of a GMM and the weights of an MLPQ. A similar approach is applied to radial basis function networks (RBFNs): RBFNs with Gaussian basis functions and the Euclidean norm can also be approximated accurately by an MLPQ, and the corresponding weight-mapping equations are presented. There are well-established training procedures for GMMs, such as the expectation-maximization (EM) algorithm, so the GMM parameters obtained by EM can be used to generate a set of initial MLPQ weights; similarly, a trained RBFN can be used to initialize an MLPQ. MLPQ training can then be continued with gradient-descent-based methods, which can improve performance over the GMM or RBFN used for initialization. Thus, the MLPQ can always perform as well as or better than the GMM or RBFN.
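As a minimal sketch of the GMM-to-MLPQ initialization described above (not the brief's exact mapping equations), note that the log-density of a diagonal-covariance Gaussian is linear in the augmented input [x, x^2], so each mixture component can seed one hidden unit of the MLPQ. The snippet below assumes scikit-learn's GaussianMixture for the EM step; the variable names and layer layout are illustrative assumptions, not taken from the brief.

import numpy as np
from scipy.special import logsumexp
from sklearn.mixture import GaussianMixture

# For a diagonal Gaussian, the component log-density is linear in [x, x^2]:
#   log N(x; mu, sigma^2) = sum_d [ -x_d^2 / (2 sigma_d^2) + mu_d x_d / sigma_d^2 ]
#                           - sum_d [ mu_d^2 / (2 sigma_d^2) + 0.5 log(2 pi sigma_d^2) ]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))

# EM step via scikit-learn (assumed tooling, not the brief's implementation).
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(X)
mu, var, pi = gmm.means_, gmm.covariances_, gmm.weights_  # shapes (K, D), (K, D), (K,)

# MLPQ first-layer parameters: one hidden unit per mixture component.
W_quad = -0.5 / var                      # weights on the squared inputs x^2
W_lin = mu / var                         # weights on the raw inputs x
bias = (np.log(pi)                       # fold the mixture weight into the bias
        - 0.5 * np.sum(mu**2 / var, axis=1)
        - 0.5 * np.sum(np.log(2.0 * np.pi * var), axis=1))

# Hidden pre-activations on the augmented input [x, x^2] equal the
# per-component log of (mixture weight * Gaussian density).
pre_act = X @ W_lin.T + (X**2) @ W_quad.T + bias

# The GMM log-likelihood is the log-sum-exp over components; this is the
# quantity the subsequent MLPQ layers approximate before further
# gradient-descent training.
log_lik = logsumexp(pre_act, axis=1)
print(np.allclose(log_lik, gmm.score_samples(X)))  # True: the mapping is exact here

The same algebra applies to a Gaussian-basis RBFN with the Euclidean norm, since the log of each basis function is likewise quadratic in x.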
