Abstract

To overcome the computational burden of quadratic programming in kernel expectile regression (KER), the iteratively reweighted least squares (IRLS) technique was introduced in the literature, resulting in IRLS-KER. However, for nonlinear models, IRLS-KER involves operations on matrices and vectors whose sizes match the training set, so as the training set grows, nonlinear IRLS-KER requires a long training time and a large amount of memory. To further reduce the training cost, this paper projects the original data into a low-dimensional space via random Fourier features. The inner product of the random Fourier features of two data points approximates the kernel function evaluated at those two points. Hence, a linear model in the new low-dimensional space can approximate the original nonlinear model, and consequently, time- and memory-efficient linear training algorithms can be applied. This paper applies the idea of random Fourier features to IRLS-KER, and our tests on simulated and real-world datasets show that random Fourier features allow IRLS-KER to achieve prediction accuracy similar to that of the original nonlinear version with substantially higher time efficiency.
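The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an RBF kernel exp(-gamma ||x - y||^2) (whose random Fourier features are cosines with Gaussian frequencies), a ridge penalty lam, and the standard asymmetric-least-squares weighting for expectiles; the variable names (D for the feature dimension, tau for the expectile level) are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(X, W, b):
    # Random Fourier feature map z(x) = sqrt(2/D) * cos(W^T x + b);
    # E[z(x)^T z(y)] = exp(-gamma * ||x - y||^2) for the sampling below.
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Synthetic 1-D regression data (illustrative only)
n, d, D, gamma, tau = 400, 1, 300, 2.0, 0.7
X = rng.uniform(-2, 2, size=(n, d))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=n)

# Frequencies drawn from the Fourier transform of the RBF kernel
W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = rff(X, W, b)

# Sanity check: feature inner products approximate the exact kernel
sq = ((X[:50, None, :] - X[None, :50, :]) ** 2).sum(-1)
K_exact = np.exp(-gamma * sq)
approx_err = np.abs(K_exact - Z[:50] @ Z[:50].T).max()

# IRLS for linear expectile regression in the feature space:
# minimize sum_i w_i * (y_i - z_i^T beta)^2 + lam * ||beta||^2,
# with w_i = tau if the residual is nonnegative, else 1 - tau.
lam = 1e-3
beta = np.zeros(D)
for _ in range(30):
    r = y - Z @ beta
    w = np.where(r >= 0, tau, 1 - tau)
    Zw = Z * w[:, None]
    # Each iteration solves a D x D system instead of an n x n one
    beta = np.linalg.solve(Zw.T @ Z + lam * np.eye(D), Zw.T @ y)

fit_rmse = np.sqrt(np.mean((Z @ beta - y) ** 2))
```

The key cost saving is visible in the IRLS loop: each iteration works with D x D matrices (D fixed by the user) rather than the n x n kernel matrix, so training time and memory no longer grow quadratically with the training-set size.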
