Abstract

In recent years, kernel-based methods have been applied successfully to a variety of tasks such as classification. However, in the typical use of these methods, the choice of kernel is crucial to the performance on a specific task. Thus, instead of selecting a single kernel, multiple kernel learning (MKL) has been proposed, which uses a combination of kernels whose weights are optimized during training. MKL methods combine kernels in linear, nonlinear, or data-dependent ways, and MKL-based methods have outperformed single-kernel methods such as the Support Vector Machine (SVM). In this article, we propose a new MKL method based on a local (data-dependent) and nonlinear combination of different kernels, using a gating model to select the appropriate kernel function. We call our proposal localized nonlinear multiple kernel learning (LNLMKL). In our experiments on binary microarray classification, different kernels were used in the SVM, and different kernel combinations were used for our proposal and for the other MKL methods. Finally, we report the results of these experiments on eight high-dimensional microarray data sets, demonstrating that our proposal performed better than the other methods analyzed.
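
To make the idea of a gating-based, data-dependent kernel combination concrete, the following is a minimal sketch (not the authors' implementation) of how a softmax gating model can assign per-sample weights to several base kernels, in the spirit of localized MKL. The gating parameters `V` and `v0`, the base kernels, and the combination rule shown here are illustrative assumptions; the abstract does not specify the exact nonlinear combination used by LNLMKL, and in practice the gating parameters are learned jointly with the classifier rather than fixed.

```python
# Illustrative sketch of a localized (gated) kernel combination; hypothetical
# parameters, not the LNLMKL method itself.
import numpy as np

def linear_kernel(X, Z):
    return X @ Z.T

def rbf_kernel(X, Z, gamma=0.1):
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

def gating(X, V, v0):
    """Softmax gating model: one weight per kernel for each sample."""
    scores = X @ V + v0                       # shape (n_samples, n_kernels)
    scores -= scores.max(axis=1, keepdims=True)
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def localized_kernel(X, Z, kernels, V, v0):
    """K(x_i, z_j) = sum_m eta_m(x_i) * k_m(x_i, z_j) * eta_m(z_j)."""
    eta_X, eta_Z = gating(X, V, v0), gating(Z, V, v0)
    K = np.zeros((X.shape[0], Z.shape[0]))
    for m, k in enumerate(kernels):
        K += eta_X[:, [m]] * k(X, Z) * eta_Z[:, m][None, :]
    return K

# Toy usage with random data and fixed (untrained) gating parameters;
# the resulting matrix could be passed to an SVM as a precomputed kernel.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
kernels = [linear_kernel, rbf_kernel]
V = rng.standard_normal((5, len(kernels)))
v0 = np.zeros(len(kernels))
K = localized_kernel(X, X, kernels, V, v0)
print(K.shape)  # (20, 20)
```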
