Abstract

Multiple Kernel Learning (MKL) has been shown to improve classification performance effectively, but it incurs a high computational cost in large-scale settings. In this paper, we aim to reduce both the time and space complexity of MKL, and to that end propose an efficient multi-kernel classification machine based on the Nyström approximation. First, we generate $M$ different kernel matrices $K_p$ for the given data. Second, we apply the Nyström approximation technique to each $K_p$ to obtain its corresponding approximation matrix $\tilde{K}_p$. Third, we fuse the generated $\tilde{K}_p$ matrices into a final ensemble matrix $\tilde{G}$ with a heuristic rule. Finally, we adopt the Kernelized Modification of Ho–Kashyap algorithm with Squared approximation of the misclassification errors (KMHKS) as the incorporated learning paradigm and apply $\tilde{G}$ within KMHKS. In doing so, we obtain a multi-kernel classification machine with reduced complexity, named Nyström approximation matrix with Multiple KMHKSs (NMKMHKS). The experimental results reported here validate both the effectiveness and the efficiency of the proposed NMKMHKS. The contributions of NMKMHKS are: (1) compared with existing MKL, NMKMHKS reduces the computational complexity of finding the solution from $O(Mn^3)$ to $O(Mnm^2)$, where $M$ is the number of kernels, $n$ is the number of training samples, and $m$ is the number of columns sampled from each $K_p$; meanwhile, NMKMHKS reduces the space complexity of storing the kernel matrices from $O(Mn^2)$ to $O(n^2)$; (2) compared with the original KMHKS, NMKMHKS improves the classification performance while keeping a comparable space complexity; (3) NMKMHKS achieves better recognition when the multiple kernel matrices $K_p$ are strongly correlated; and (4) NMKMHKS has a tighter generalization risk bound in terms of Rademacher complexity analysis.
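To make steps 1–3 concrete, the following is a minimal NumPy sketch of building multiple kernel matrices, Nyström-approximating each one, and fusing them. It assumes RBF base kernels, uniform column sampling, and a plain average standing in for the paper's unspecified heuristic fusion rule; the KMHKS training step itself is omitted, and all data, parameter values, and function names are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X (illustrative base kernel)."""
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def nystrom_approximation(K, m, rng):
    """Nystrom approximation of an n x n kernel matrix K from m sampled columns.

    Returns the rank-<=m approximation K_tilde = C @ pinv(W) @ C.T, where C holds
    the m sampled columns of K and W is the corresponding m x m submatrix.
    """
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # uniform column sampling (one common choice)
    C = K[:, idx]                                # n x m
    W = K[np.ix_(idx, idx)]                      # m x m
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))   # toy data: n = 500 samples, 10 features
gammas = [0.1, 1.0, 10.0]            # M = 3 base kernels K_p

# Accumulate the fused ensemble matrix G_tilde one kernel at a time, so only a
# single n x n matrix is ever stored (matching the O(n^2) space claim).
G_tilde = np.zeros((X.shape[0], X.shape[0]))
for gamma in gammas:
    K_p = rbf_kernel(X, gamma)                         # step 1: kernel matrix K_p
    G_tilde += nystrom_approximation(K_p, m=50, rng=rng)  # step 2: K_tilde_p
G_tilde /= len(gammas)                                 # step 3: simple-average fusion
```

Since each $\tilde{K}_p$ has rank at most $m \ll n$, the dominant per-kernel cost is driven by the $m$ sampled columns rather than the full $n \times n$ eigendecomposition, which is what underlies the reduction from $O(Mn^3)$ to $O(Mnm^2)$ noted above.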
