Abstract

Training several models and averaging their predictions can improve the performance of a machine-learning algorithm. For the jointly optimized models to generalize well to unseen data, generalization information must be transferred between them. In this article, a multiple kernel mutual learning method based on transfer learning of combined mid-level features is proposed for hyperspectral image classification. Three layers of homogeneous superpixels are computed on the image formed by PCA and used to extract mid-level features. The three mid-level features are: 1) the sparse-reconstruction feature; 2) the combined mean feature; and 3) the uniqueness. The sparse-reconstruction feature is obtained by a joint sparse representation model under the constraint of the boundaries and regions of the three-scale superpixels. The combined mean feature is computed from the average spectra within the multilayer superpixels, and the uniqueness is obtained by superposing the manifold-ranking values of the multilayer superpixels. Next, three kernels of the samples in the different feature spaces are computed and mutually learned by minimizing their divergence. A combined kernel is then constructed to optimize the sample distance measurement and used to train SVM classifiers. Experiments on real hyperspectral datasets demonstrate that the proposed method performs significantly better than several state-of-the-art algorithms based on multiple kernel learning (MKL) and deep learning.
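The final stage of the pipeline, building a combined kernel over several feature spaces and training an SVM on it, can be sketched as follows. This is a minimal illustration, not the paper's method: the feature matrices, the RBF kernel choice, and the uniform combination weights are assumptions for the sketch, whereas the paper learns the kernel combination through mutual learning by divergence minimization.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

# Toy data standing in for samples in three mid-level feature spaces
# (sparse-reconstruction, combined-mean, uniqueness); shapes and values
# are illustrative only.
rng = np.random.default_rng(0)
n = 60
feats = [rng.normal(size=(n, d)) for d in (8, 8, 4)]  # three feature spaces
y = rng.integers(0, 2, size=n)                        # binary labels for the sketch

# One base kernel per feature space
kernels = [rbf_kernel(X, gamma=0.5) for X in feats]

# Combined kernel: convex combination of the base kernels.
# Uniform weights here; the paper instead optimizes the combination
# via mutual learning between the kernels.
w = np.ones(len(kernels)) / len(kernels)
K = sum(wi * Ki for wi, Ki in zip(w, kernels))

# SVM trained on the precomputed combined kernel
clf = SVC(kernel="precomputed").fit(K, y)
acc = clf.score(K, y)  # training accuracy of the sketch
```

A convex combination of valid kernels is itself a valid (positive semidefinite) kernel, which is what licenses passing `K` to the SVM as a precomputed Gram matrix.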
