Abstract

Kernel-based learning methods are widely used in machine learning tasks such as dimensionality reduction, classification and regression. Because their performance depends on the choice of kernel, optimising the kernel function is an important issue in kernel-based learning. A novel formulation is proposed for automatically learning a kernel as a linear combination of base kernel functions by optimising a discriminant criterion. Feature extraction and kernel selection are carried out jointly while the discriminant criterion is optimised. The proposed method is applicable to any discriminant criterion that can be formulated in a pairwise manner as the objective function, and therefore provides a framework for optimising multiple kernel subspace analysis. Extensive experiments on UCI data sets, handwritten numerical characters, face images and gene data sets demonstrate the effectiveness of the proposed method.
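
To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of multiple-kernel discriminant learning: a combined kernel K = sum_m beta_m * K_m is optimised by alternating kernel-Fisher-style feature extraction with a simple reweighting of the base kernels. The choice of RBF base kernels, the trace-ratio criterion, the regularisation, and the weight-update rule are all assumptions made for illustration.

```python
# Illustrative sketch of multiple-kernel discriminant learning (assumed details,
# not the paper's exact algorithm): alternate between (a) extracting a
# discriminant direction for the current combined kernel and (b) updating the
# kernel weights toward base kernels with a larger between/within scatter ratio.
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, gamma):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def scatter_matrices(K, y):
    """Between-class (B) and within-class (W) scatter expressed through the
    kernel matrix K, i.e. the pairwise form used in kernel Fisher analysis."""
    n = len(y)
    mean_all = K.mean(axis=1, keepdims=True)
    B = np.zeros((n, n))
    W = np.zeros((n, n))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        mean_c = Kc.mean(axis=1, keepdims=True)
        B += len(idx) * (mean_c - mean_all) @ (mean_c - mean_all).T
        W += (Kc - mean_c) @ (Kc - mean_c).T
    return B, W

def mkl_discriminant(X, y, gammas=(0.1, 1.0, 10.0), n_iter=20, step=0.5):
    """Jointly learn kernel weights beta and a discriminant direction a."""
    Ks = [rbf_kernel(X, g) for g in gammas]          # base kernels (assumed RBF)
    beta = np.ones(len(Ks)) / len(Ks)                # uniform initial weights
    for _ in range(n_iter):
        K = sum(b * Km for b, Km in zip(beta, Ks))   # combined kernel
        B, W = scatter_matrices(K, y)
        eps = 1e-6 * np.trace(W) / len(y)            # small ridge for stability
        # Feature extraction: top generalised eigenvector of B a = lambda (W + eps I) a.
        _, evecs = eigh(B, W + eps * np.eye(len(y)))
        a = evecs[:, -1:]
        # Kernel selection: shift weight toward base kernels whose scatter
        # ratio along the extracted direction is largest (illustrative rule).
        scores = []
        for Km in Ks:
            Bm, Wm = scatter_matrices(Km, y)
            scores.append((a.T @ Bm @ a).item() / ((a.T @ Wm @ a).item() + eps))
        scores = np.array(scores)
        beta = (1 - step) * beta + step * scores / scores.sum()
    return beta, a

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(2, 1, (30, 5))])
    y = np.array([0] * 30 + [1] * 30)
    beta, a = mkl_discriminant(X, y)
    print("learned kernel weights:", np.round(beta, 3))
```

In this sketch the weight update keeps beta non-negative and summing to one, so the combined kernel remains a valid (positive semidefinite) convex combination of the base kernels; any pairwise discriminant criterion could be substituted for the trace-ratio score used here.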
