Abstract

This paper considers a least squares regularized regression algorithm for multi-task learning in a union of reproducing kernel Hilbert spaces (RKHSs) with Gaussian kernels. It is assumed that the optimal prediction functions of the target task and of the related tasks lie in RKHSs induced by Gaussian kernels of the same, but unknown, width. The samples from the related tasks are used to select the Gaussian kernel width, and the sample from the target task is used to obtain the prediction function in the RKHS with the selected width. Via an error decomposition, a fast learning rate is derived for the target task; the key step is estimating the sample errors of the related tasks in the union of RKHSs with Gaussian kernels. The utility of the algorithm is illustrated on one simulated data set and four real data sets. The experimental results show that the algorithm can significantly reduce prediction error when few samples are available for the target task but more samples are available for the related tasks.
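The two-stage scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes kernel ridge regression as the regularized least squares estimator, a simple hold-out split on each related task for width selection, and synthetic toy data; the candidate width grid, regularization parameter, and task functions are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(A, B, width):
    """Gaussian kernel K(x, x') = exp(-||x - x'||^2 / width^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / width ** 2)

def krr_fit(X, y, width, lam):
    """Regularized least squares (kernel ridge) in the RKHS of the given width."""
    n = len(X)
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_new, width):
    return gaussian_kernel(X_new, X_train, width) @ alpha

def select_width(related_tasks, widths, lam=1e-2):
    """Stage 1: pick the width minimizing average hold-out error
    over the related tasks (an assumed selection criterion)."""
    def avg_error(w):
        errs = []
        for X, y in related_tasks:
            n = len(X) // 2  # first half to fit, second half to validate
            alpha = krr_fit(X[:n], y[:n], w, lam)
            pred = krr_predict(X[:n], alpha, X[n:], w)
            errs.append(np.mean((pred - y[n:]) ** 2))
        return np.mean(errs)
    return min(widths, key=avg_error)

def make_task(n, f):
    """Toy 1-d regression task with noise (illustrative data only)."""
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = f(X[:, 0]) + 0.1 * rng.normal(size=n)
    return X, y

# Related tasks share the same smoothness scale as the target function.
related = [make_task(200, lambda x: np.cos(3 * x)),
           make_task(200, lambda x: np.sin(3 * x + 0.5))]
width = select_width(related, widths=[0.05, 0.2, 0.5, 1.0, 2.0])

# Stage 2: fit the target task (few samples) in the RKHS with the selected width.
X_t, y_t = make_task(20, lambda x: np.sin(3 * x))
alpha = krr_fit(X_t, y_t, width, lam=1e-2)
```

The point of the sketch is the division of labor: the plentiful related-task samples carry the burden of estimating the unknown kernel width, so the scarce target-task sample is spent only on fitting within the selected RKHS.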
