Abstract
This chapter is dedicated to nonparametric modeling of nonlinear functions in reproducing kernel Hilbert spaces (RKHS). The basic definitions and concepts behind RKHSs are presented, including positive definite kernels, reproducing kernels, kernel matrices, and the kernel trick. Cover's theorem and the representer theorem are introduced. Then, kernel ridge regression, support vector regression, and support vector machines are studied. The concept of random Fourier features for kernel approximation is introduced, and its application to online and distributed learning is discussed. The notion of multiple kernel learning is presented, along with a discussion of sparse modeling for nonparametric models in the context of additive models. The chapter closes with a case study on text authorship identification using string kernels.
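To make the flavor of these methods concrete, the following is a minimal sketch of kernel ridge regression with a Gaussian (RBF) kernel on toy 1-D data. It is an illustrative example only, not taken from the chapter; the data, kernel width, and regularization value are assumptions chosen for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Toy data: noisy samples of sin(x) (hypothetical, for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

# Kernel ridge regression: solve (K + lam*I) alpha = y in the dual
lam = 0.1  # ridge regularization parameter (assumed value)
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Predict via the representer-theorem form f(x) = sum_i alpha_i k(x, x_i)
X_test = np.linspace(-3, 3, 100)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha
```

Note that the predictor is expressed entirely through kernel evaluations on the training points, as the representer theorem guarantees; the nonlinear feature map itself never appears explicitly, which is the essence of the kernel trick.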