Abstract

Kernel-based discriminant analysis is an effective nonlinear mechanism for pattern analysis. Conventional methods built on a single kernel function may be insufficient for datasets with complicated geometric structures. A combination of multiple kernels can represent complementary information about the original data from multiple views and thereby improve recognition performance. However, discriminant analysis methods based on multiple-kernel combinations face two challenges: optimizing the weights of the "base kernels" and a heavy computational burden. To address these challenges, this paper proposes a novel multi-kernel discriminant analysis method based on support vectors (MKDASV), which represents the data structure more effectively by incorporating both between-class and within-class information. First, a multi-kernel SVM is trained to obtain the weight of each "base kernel" and the support vectors; next, the discriminant criteria are constructed by combining the margin-maximization theory of SVM with the within-class scatter used in the LDA algorithm; finally, to reduce the amount of computation, only the support vectors participate as training samples in the dimensionality-reduction step. Experimental results on six standard databases show that the proposed method outperforms five competing methods in both classification accuracy and computational efficiency.
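The three-stage pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the kernel weights are fixed by hand here (the paper learns them via multi-kernel SVM optimization), the base kernels and their parameters are assumed, and standard scikit-learn components stand in for the authors' discriminant criteria.

```python
# Hypothetical sketch of the MKDASV pipeline from the abstract:
# (1) weighted combination of base kernels, (2) SVM on the combined
# kernel to find support vectors, (3) discriminant analysis trained
# on the support vectors only to cut the dimensionality-reduction cost.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# Step 1: combine two "base kernels". Weights are assumed constants;
# MKDASV obtains them from the multi-kernel SVM instead.
w = [0.7, 0.3]
K = w[0] * rbf_kernel(X, gamma=0.5) + w[1] * polynomial_kernel(X, degree=2)

# Step 2: SVM on the precomputed combined kernel; keep the support vectors.
svm = SVC(kernel="precomputed").fit(K, y)
sv = svm.support_  # indices of the support vectors

# Step 3: discriminant analysis fitted on support vectors only,
# then used to project every sample into the reduced space.
lda = LinearDiscriminantAnalysis(n_components=2).fit(X[sv], y[sv])
Z = lda.transform(X)
print(Z.shape, len(sv))
```

Because only the support vectors enter the fitting step, the scatter matrices are built from far fewer samples than the full training set, which is the source of the computational savings claimed in the abstract.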
