Existing multiple kernel learning (MKL) algorithms pre-specify a group of base kernels and indiscriminately apply the same set of kernel combination weights to all samples. Sample-adaptive MKL (SAMKL) overcomes this limitation by adaptively switching the base kernels on or off with respect to each sample. However, existing SAMKL is still restricted to solving MKL problems with pre-specified kernels, and its formulation reduces to an $\ell _{1}$ -norm MKL, which lacks flexibility. To allow robust kernel mixtures that generalize well in practical applications, we extend SAMKL to the arbitrary norm and apply it to image classification. In this paper, we derive a closed-form solution for optimizing the kernel weights based on the equivalence between group lasso and MKL, and obtain an efficient $\ell _{q}$ -norm ( $q\geq 1$ , the norm imposed on the kernel weights) SAMKL algorithm. The cutting plane method is used to solve the resulting margin maximization problem. In addition, we propose a framework for solving MKL problems in image classification. Experimental results on multiple data sets show the promising performance of the proposed solution compared with other competitive methods.
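For intuition, the kernel-weight subproblem in $\ell _{q}$ -norm MKL admits a closed-form update of the following standard form from the general $\ell _{q}$ -norm MKL literature; this is a sketch, not necessarily the paper's exact derivation, and the symbols $\theta _{m}$ (kernel weights), $w_{m}$ (the primal block associated with kernel $m$ ), and $M$ (the number of base kernels) are assumed notation rather than the authors' own:
$$\theta _{m} = \frac{\|w_{m}\|_{2}^{2/(q+1)}}{\left(\sum _{m'=1}^{M}\|w_{m'}\|_{2}^{2q/(q+1)}\right)^{1/q}}, \qquad m = 1,\ldots ,M,$$
which follows from minimizing $\sum _{m}\|w_{m}\|_{2}^{2}/\theta _{m}$ subject to $\|\theta \|_{q}\leq 1$ , $\theta \geq 0$ , and recovers the sparse $\ell _{1}$ -norm case as $q\rightarrow 1$ .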