Abstract

Support vector machines (SVM), kernel principal component analysis (KPCA), and kernel Fisher discriminant analysis (KFD) are examples of successful kernel-based learning methods. By adding a regularizer and the kernel trick to a fuzzy counterpart of Gaussian mixture models (GMM), this paper proposes a clustering algorithm in an extended high-dimensional feature space. Unlike global nonlinear approaches, GMM and its fuzzy counterpart model nonlinear structure with a collection, or mixture, of local linear PCA sub-models. When the numbers of feature vectors and clusters are n and c, respectively, this kernel approach can find up to c × n nonzero eigenvalues. To reduce the number of parameters, a technique for controlling the parameter count in the mixture of probabilistic principal component analysis (PPCA) models is adopted. The resulting algorithm yields a partition with flexibly shaped clusters in the original input data space.
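As an illustration of the kernel-trick mechanics underlying this kind of clustering, the following is a minimal Python (NumPy) sketch of kernel fuzzy c-means: cluster prototypes live implicitly in the kernel-induced feature space and all distances are computed through the Gram matrix. It is not the paper's full algorithm (no regularizer, no PPCA-based parameter control), the RBF kernel is an assumed choice, and the function names (rbf_kernel, kernel_fuzzy_cmeans) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[j, l] = exp(-gamma * ||x_j - x_l||^2) for rows of X.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_fuzzy_cmeans(K, c, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Fuzzy c-means in the feature space induced by Gram matrix K (n x n)."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.random((c, n))
    U /= U.sum(axis=0, keepdims=True)           # memberships: columns sum to 1
    diagK = np.diag(K)
    for _ in range(n_iter):
        Um = U**m
        W = Um / Um.sum(axis=1, keepdims=True)   # normalized weights per cluster
        # Squared feature-space distance ||phi(x_k) - v_i||^2 via the kernel trick:
        # K(x_k, x_k) - 2 * sum_j W[i,j] K(x_j, x_k) + sum_{j,l} W[i,j] W[i,l] K(x_j, x_l)
        d2 = (diagK[None, :]
              - 2.0 * (W @ K)
              + np.sum((W @ K) * W, axis=1, keepdims=True))
        d2 = np.maximum(d2, 1e-12)
        # Standard fuzzy membership update using squared distances.
        U_new = 1.0 / np.sum((d2[:, None, :] / d2[None, :, :])**(1.0 / (m - 1.0)), axis=1)
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return U
```

Under these assumptions, for a data matrix X of shape (n, d) one would call U = kernel_fuzzy_cmeans(rbf_kernel(X, gamma=0.5), c=3); because distances are measured in the kernel-induced feature space, the resulting memberships can describe flexibly shaped clusters in the original input space.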
