Abstract

The performance of kernel-based dimensionality reduction relies heavily on the selection of kernel functions. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a convex combination of a set of base kernels. However, this method relaxes a nonconvex quadratically constrained quadratic programming (QCQP) problem into a semidefinite programming (SDP) problem to determine the kernel weights, which may degrade its performance. Although a trace ratio maximization approach to multiple-kernel-based dimensionality reduction (MKL-TR) has been presented to avoid this convex relaxation, it must solve a generalized eigenvalue problem in each iteration of its algorithm, which is expensive in both time and memory. To further improve on these methods, this paper proposes a novel multiple kernel dimensionality reduction method based on spectral regression and trace ratio maximization, termed MKL-SRTR. The proposed approach learns both an appropriate combination of the base kernels and a transformation into a lower-dimensional space efficiently and effectively. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed method in supervised, unsupervised, and semi-supervised scenarios.
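To make the two ingredients the abstract refers to concrete, the sketch below shows (i) a convex combination of base kernel matrices with nonnegative weights summing to one, and (ii) evaluation of a trace ratio objective tr(WᵀS_bW)/tr(WᵀS_wW). This is only an illustrative sketch of the general multiple-kernel and trace-ratio setting, not the MKL-SRTR algorithm itself; all function names and the toy RBF kernels are assumptions for illustration.

```python
import numpy as np

def combine_kernels(kernels, weights):
    """Form a convex combination K = sum_m beta_m * K_m of base kernel matrices.

    `kernels` is a list of (n, n) Gram matrices; `weights` is a nonnegative
    vector summing to one, as in the multiple kernel learning setting.
    """
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    return sum(w * K for w, K in zip(weights, kernels))

def trace_ratio(W, S_between, S_within, eps=1e-12):
    """Evaluate the trace ratio objective tr(W^T S_b W) / tr(W^T S_w W),
    the quantity maximized in trace-ratio-based dimensionality reduction."""
    num = np.trace(W.T @ S_between @ W)
    den = np.trace(W.T @ S_within @ W) + eps
    return num / den

# Toy usage: two RBF base kernels with different bandwidths on random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
kernels = [np.exp(-sq_dists / (2.0 * s ** 2)) for s in (0.5, 2.0)]
K = combine_kernels(kernels, [0.3, 0.7])
print(K.shape)  # (50, 50) combined kernel matrix
```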
