Abstract

Solving a kernel regression problem typically incurs high computation and storage costs because the kernel matrix is large. To tackle this problem, the Nyström method has been proposed and is widely applied to large-scale kernel methods as an approximate solution. The key idea of this method is to select a subset of columns of the kernel matrix and build from them a low-rank approximation to the dense kernel matrix. To reduce the computational cost of sparse kernel regression, we exploit the merits of the Nyström approximation and present two non-uniform Nyström methods with theoretical guarantees for sparse kernel regression. In detail, we first provide an upper bound on the solution of sparse kernel regression via Nyström approximation. Based on this bound, we prove upper bounds on the optimal solutions under two notable non-uniform landmark selection strategies, namely Determinantal Point Processes (DPPs) and Ridge Leverage Scores (RLS). Compared with the uniform Nyström method, we empirically demonstrate the superior performance of non-uniform Nyström methods for sparse kernel regression on a synthetic dataset and several real-world datasets.
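To make the construction concrete, below is a minimal sketch of the Nyström approximation comparing uniform landmark sampling with sampling proportional to ridge leverage scores. It assumes an RBF kernel; all function names are illustrative rather than the paper's implementation, and the exact leverage-score computation shown here forms the full kernel matrix purely for demonstration (practical RLS methods approximate the scores to avoid this cost).

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared distances, mapped through the RBF kernel.
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def ridge_leverage_scores(K, lam):
    # Exact lambda-ridge leverage scores: diag(K (K + n*lam*I)^{-1}).
    # Illustrative only: this requires the full n x n kernel matrix.
    n = K.shape[0]
    return np.diag(K @ np.linalg.inv(K + n * lam * np.eye(n)))

def nystrom(K, idx):
    # Low-rank approximation K_hat = C W^+ C^T, where C = K[:, idx]
    # holds the selected columns and W is the landmark-landmark block.
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 5))
K = rbf_kernel(X, X)

# Non-uniform (RLS) landmark selection vs. uniform sampling.
scores = ridge_leverage_scores(K, lam=1e-2)
idx_rls = rng.choice(len(scores), size=40, replace=False,
                     p=scores / scores.sum())
idx_uni = rng.choice(len(scores), size=40, replace=False)

for name, idx in [("RLS", idx_rls), ("uniform", idx_uni)]:
    err = np.linalg.norm(K - nystrom(K, idx)) / np.linalg.norm(K)
    print(f"{name}: relative Frobenius error = {err:.4f}")
```

On such synthetic data, the leverage-score sampler tends to place landmarks on the more informative points, which is the intuition behind the non-uniform strategies analyzed in the paper; a DPP-based sampler would similarly favor diverse landmark sets.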
