Abstract

Although the incremental Nystrom method has been used in kernel approximation, it is not suitable for online kernel learning due to its cubic time complexity and lack of theoretical guarantees. In this paper, we propose a novel incremental Nystrom method, which runs in linear time with respect to the sampling size at each round and enjoys a sublinear regret bound for online kernel learning. We construct the intersection matrix using the ridge leverage score estimator, compute the rank-$k$ approximation of the intersection matrix incrementally via incremental singular value decomposition, and recalculate the generalized inverse matrix periodically. When applying the proposed incremental Nystrom method to online kernel learning, we approximate the kernel matrix using the updated generalized inverse matrix at each round, and formulate the explicit feature mapping by the singular value decomposition of the approximated kernel matrix, yielding a linear classifier for online kernel learning at each round. Theoretically, we prove that our incremental Nystrom method has a $(1+\epsilon)$ relative-error bound for kernel matrix approximation, enjoys a sublinear regret bound when using the online gradient descent method for online kernel learning, and reduces the time complexity of generalized inverse computation from $O(m^{3})$ to $O(mk)$ at each round, where $m$ is the sampling size and $k$ is the truncated rank. Experimental results show that the proposed incremental Nystrom method is accurate and efficient in kernel matrix approximation and is suitable for online kernel learning.
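To make the pipeline described above concrete, the following Python sketch shows the static, non-incremental version of the idea: a rank-$k$ Nystrom approximation of the kernel matrix, the explicit feature mapping derived from the truncated SVD of the intersection matrix, and an online gradient descent step on the resulting linear features. It is not the authors' algorithm: the SVD of the intersection matrix is recomputed from scratch rather than updated incrementally, uniform landmark sampling stands in for the ridge leverage score estimator, and all function names and parameters (rbf_kernel, nystrom_features, gamma, eta) are illustrative assumptions.

```python
# Minimal sketch of rank-k Nystrom features + online gradient descent.
# Assumptions: RBF kernel, uniform landmark sampling, batch (non-incremental) SVD.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values between the rows of X and Y.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_features(X, landmarks, k, gamma=1.0):
    """Explicit feature map such that Phi @ Phi.T is the rank-k Nystrom
    approximation C @ pinv_k(W) @ C.T of the kernel matrix K."""
    C = rbf_kernel(X, landmarks, gamma)          # n x m columns of K
    W = rbf_kernel(landmarks, landmarks, gamma)  # m x m intersection matrix
    U, s, _ = np.linalg.svd(W)                   # the paper updates this SVD incrementally
    U_k, s_k = U[:, :k], np.maximum(s[:k], 1e-12)
    # phi(x) = diag(s_k)^{-1/2} U_k^T k(landmarks, x)
    return (C @ U_k) / np.sqrt(s_k)

# Toy usage: sample m landmarks, build features, run online gradient descent.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
y = np.sign(X[:, 0])                                     # toy labels
landmarks = X[rng.choice(500, size=50, replace=False)]   # stand-in for leverage-score sampling
Phi = nystrom_features(X, landmarks, k=20)               # 500 x 20 explicit features

w, eta = np.zeros(Phi.shape[1]), 0.1
for t in range(len(y)):
    if y[t] * (w @ Phi[t]) < 1:                          # hinge-loss subgradient step
        w += eta * y[t] * Phi[t]
```

In the incremental setting of the paper, the landmark set grows over rounds, the SVD of the intersection matrix is updated rather than recomputed, and the generalized inverse is recalculated only periodically, which is what reduces the per-round cost from $O(m^{3})$ to $O(mk)$.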
