Abstract

This paper presents a preconditioner-based method for solving the kernel ridge regression problem. In contrast to other methods, which rely on either fast matrix-vector multiplication or a preconditioner alone, the suggested approach uses randomized matrix decompositions to build a preconditioner with a special structure that can also exploit fast matrix-vector multiplications. This hybrid approach is exact, effective at reducing the condition number, and computationally efficient, enabling the processing of large datasets with computational complexity linear in the number of data points. A theoretical upper bound on the condition number is also provided. For Gaussian kernels, we show that, given a desired condition number, the rank of the required preconditioner can be determined directly from the dataset.
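To make the general idea concrete, the following is a minimal sketch of one common instance of this family of methods: a randomized low-rank approximation of the Gaussian kernel matrix used as a preconditioner for conjugate gradients, with the preconditioner inverse applied via the Woodbury identity. All function names, parameters, and the particular decomposition (a randomized range-finder eigendecomposition) are illustrative assumptions, not the paper's exact construction, and the sketch omits the fast matrix-vector multiplication machinery the paper combines with its preconditioner.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def randomized_eig(K_mv, n, rank, n_oversample=10, seed=None):
    # Randomized range finder + eigendecomposition of a PSD operator,
    # using only matrix-vector (matrix-block) products with K
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, rank + n_oversample))
    Q, _ = np.linalg.qr(K_mv(Omega))
    B = Q.T @ K_mv(Q)                       # small projected matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:rank]
    return Q @ V[:, idx], np.maximum(w[idx], 0.0)   # U (n x r), eigenvalues (r,)

def solve_krr_pcg(X, y, sigma=1.0, lam=1e-3, rank=50):
    # Solve (K + lam * I) alpha = y by preconditioned conjugate gradients
    n = len(X)
    K = gaussian_kernel(X, X, sigma)        # dense here; a fast MVM could replace it
    A = LinearOperator((n, n), matvec=lambda v: K @ v + lam * v)

    # Low-rank eigen-approximation K ~= U diag(w) U^T from randomized sketching
    U, w = randomized_eig(lambda M: K @ M, n, rank, seed=0)

    # Preconditioner P = U diag(w) U^T + lam * I; its inverse via Woodbury:
    # P^{-1} v = (v - U diag(w / (w + lam)) U^T v) / lam
    def apply_pinv(v):
        return (v - U @ (w / (w + lam) * (U.T @ v))) / lam

    M = LinearOperator((n, n), matvec=apply_pinv)
    alpha, info = cg(A, y, M=M, maxiter=500)
    return alpha

# Usage: fit on toy data, then evaluate the regressor at the training points
X = np.random.default_rng(0).standard_normal((500, 3))
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).standard_normal(500)
alpha = solve_krr_pcg(X, y, sigma=1.0, lam=1e-2, rank=40)
y_hat = gaussian_kernel(X, X, 1.0) @ alpha
```

The design point illustrated here is the one the abstract emphasizes: the preconditioner is built once from a randomized decomposition, each application of its inverse costs only O(nr) operations, and the outer iteration still touches the kernel only through matrix-vector products, so any fast multiplication scheme can be plugged in without changing the solver.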
