Abstract

In this letter, we propose a sequential training scheme for multikernel support vector regression (SVR). Unlike the multistage backfitting technique, our method re-tunes, at every stage, all previously trained weights using a semiparametric algorithm in the presence of one more kernel function. In this way, local minima are avoided and any combination of arbitrary kernel functions is acceptable. Through experiments on synthetic and real data sets, we demonstrate that our method yields a better trade-off between sparsity and accuracy than the conventional single-kernel SVR and the multikernel backfitting SVR.
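
To make the idea concrete, the sketch below shows one way a sequential multikernel SVR loop can be organized: at each stage one more kernel is added to a combined Gram matrix and the SVR is re-fit on the full combination, so all previously trained weights are re-estimated jointly rather than frozen as in backfitting. This is only a minimal illustration built on scikit-learn's precomputed-kernel SVR with an unweighted kernel sum; the kernel choices, the summation rule, and the helper names are assumptions, and it does not reproduce the authors' semiparametric re-tuning algorithm.

import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel, linear_kernel

def sequential_multikernel_svr(X, y, kernel_fns, C=1.0, epsilon=0.1):
    """Illustrative sketch (not the paper's algorithm): at each stage, add one
    kernel to the combined Gram matrix and re-fit the SVR on the whole
    combination, so every previously trained weight is re-tuned jointly."""
    K_sum = np.zeros((X.shape[0], X.shape[0]))
    model = None
    for k_fn in kernel_fns:
        K_sum = K_sum + k_fn(X, X)                # one more kernel enters the model
        model = SVR(kernel="precomputed", C=C, epsilon=epsilon)
        model.fit(K_sum, y)                       # all dual weights re-estimated
    return model

def multikernel_predict(model, kernel_fns, X_train, X_test):
    """Build the same summed kernel matrix between test and training points."""
    K_test = sum(k_fn(X_test, X_train) for k_fn in kernel_fns)
    return model.predict(K_test)

# Example usage on a small synthetic regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
kernels = [
    lambda A, B: rbf_kernel(A, B, gamma=1.0),     # assumed kernel choices
    lambda A, B: polynomial_kernel(A, B, degree=3),
    linear_kernel,
]
model = sequential_multikernel_svr(X, y, kernels)
print(multikernel_predict(model, kernels, X, X[:5]))

The re-fit on the full kernel combination at every stage is the point of contrast with backfitting, where earlier components would stay fixed while only the newest one is trained.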
