Abstract

There has been growing interest in kernel methods for classification, clustering, and dimension reduction. For example, kernel Fisher discriminant analysis, spectral clustering, and kernel principal component analysis are widely used in statistical learning and data mining applications. The empirical success of the kernel method is generally attributed to the nonlinear feature mapping induced by the kernel, which in turn determines a low-dimensional data embedding. It is important to understand the effect of a kernel and its associated kernel parameter(s) on the embedding in relation to data distributions. In this paper, we examine the geometry of the nonlinear embedding for kernel principal component analysis (PCA) when polynomial kernels are used. We carry out eigen-analysis of the polynomial kernel operator associated with data distributions and investigate the effect of the degree of the polynomial. The results provide both insights into the geometry of nonlinear data embedding and practical guidelines for choosing an appropriate degree for dimension reduction with polynomial kernels. We further comment on the effect of centering kernels on the spectral property of the polynomial kernel operator. © 2013 Wiley Periodicals, Inc. Statistical Analysis and Data Mining, 2013
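To make the objects discussed above concrete, the following is a minimal sketch of kernel PCA with a polynomial kernel, including the kernel-centering step the abstract refers to. The specific kernel form (the inhomogeneous polynomial k(x, y) = (x·y + 1)^d) and all function and parameter names are illustrative assumptions, not the paper's own notation or code.

```python
import numpy as np

def polynomial_kernel_pca(X, degree=2, n_components=2):
    """Illustrative kernel PCA with an (assumed) inhomogeneous
    polynomial kernel k(x, y) = (x . y + 1)^degree.

    Returns the projection of the n training points onto the
    leading n_components nonlinear principal components.
    """
    n = X.shape[0]
    # Polynomial kernel (Gram) matrix on the sample
    K = (X @ X.T + 1.0) ** degree
    # Center the kernel matrix, i.e. center the (implicit)
    # feature map in the induced feature space
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigen-decomposition of the symmetric centered kernel matrix
    # (eigh returns eigenvalues in ascending order)
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    lambdas, alphas = eigvals[idx], eigvecs[:, idx]
    # Normalize so each feature-space eigenvector has unit norm
    alphas = alphas / np.sqrt(np.maximum(lambdas, 1e-12))
    # Embedding of the training points
    return Kc @ alphas
```

Increasing `degree` enriches the feature map (and hence the embedding geometry), which is exactly the effect whose spectral analysis the paper undertakes; in practice the degree is the main tuning parameter when this kernel is used for dimension reduction.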
