Abstract

Multi-label classification aims to construct prediction models from the input space to the output space for multi-label datasets. However, the feature space of a multi-label dataset and the hypothesis space of a classifier are often complex, so the core problem in obtaining high-performance multi-kernel learning algorithms is to design methods that compress both spaces. In this paper, by combining the local Rademacher complexity with the Hilbert-Schmidt independence criterion (HSIC), we propose an effective multi-kernel learning algorithm for multi-label classification that compresses the feature space and the hypothesis space simultaneously. Based on the tail sum of the eigenvalues of the integral operator associated with the kernels, an upper bound on the local Rademacher complexity of the class of linear functions in the feature space is derived. HSIC is then applied to maximize the dependence between a convex combination of the input kernel matrices and the ideal kernel matrix constructed from the label space. Finally, a Laplace-regularized model is built that controls the tail sum of the eigenvalues of the integral operator associated with the combined kernel. This procedure yields a combined kernel; by sorting the coefficients of the combined kernel, a feature selection algorithm can also be constructed. On this basis, the combined kernel can further be used to design a new binary classifier for multi-label classification. Experimental results verify that the proposed multi-kernel learning algorithm effectively compresses the hypothesis space, while the feature selection algorithm effectively compresses the feature space.
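
For intuition, the following is a minimal Python sketch of the HSIC alignment step described above: it uses the standard empirical HSIC estimate tr(KHLH)/(n-1)^2 (with centering matrix H = I - (1/n)11^T) to score candidate base kernels against the ideal kernel L = YY^T built from the label matrix, then forms a convex combination from the resulting weights. All variable and function names here are our own illustrative choices; a few Gaussian kernels at different bandwidths stand in for the paper's per-feature base kernels, and the paper's actual objective (including the Laplace regularizer controlling the tail sum of eigenvalues) is replaced by plain HSIC scoring, so this is an assumption-laden sketch rather than the authors' implementation.

```python
import numpy as np

def hsic(K, L):
    """Empirical HSIC between two n x n kernel matrices: tr(KHLH)/(n-1)^2."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def gaussian_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix with bandwidth parameter gamma."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

# Toy data: n samples, m candidate base kernels, q labels (all hypothetical)
rng = np.random.default_rng(0)
n, m, q = 50, 5, 3
X = rng.normal(size=(n, 20))
Y = (rng.random((n, q)) > 0.5).astype(float)   # multi-label indicator matrix

kernels = [gaussian_kernel(X, g) for g in np.logspace(-2, 2, m)]
L_ideal = Y @ Y.T                              # ideal kernel from the label space

# Score each base kernel by its HSIC dependence on the ideal kernel, then
# normalize the scores into convex combination weights (the full optimization
# with the Laplace regularizer is omitted in this sketch).
scores = np.array([hsic(K, L_ideal) for K in kernels])
mu = np.clip(scores, 0, None)
mu = mu / mu.sum()                             # convex combination coefficients
K_combined = sum(w * K for w, K in zip(mu, kernels))

# Selection step: rank base kernels (per-feature kernels in the paper)
# by their combination coefficients, as the abstract describes.
ranking = np.argsort(mu)[::-1]
print("weights:", np.round(mu, 3), "ranking:", ranking)
```

In this sketch the coefficient vector mu plays the role of the combined kernel's coefficients: sorting it gives the selection ranking, and K_combined is the kernel that a downstream binary classifier for multi-label classification would consume.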
