Abstract

Recently, kernel discriminant analysis (KDA), a nonlinear extension of linear discriminant analysis (LDA), has been successfully applied in many applications. However, the kernel function is usually defined a priori, and it is not known which kernel function is optimal for nonlinear discriminant analysis. Otsu derived the optimum nonlinear discriminant analysis (ONDA) by assuming the underlying probabilities, similarly to Bayesian decision theory. By investigating the optimum discriminant mapping constructed by ONDA, Kurita derived discriminant kernel functions (DKF), which are optimal in terms of the discriminant criterion. The derived kernel function is expressed in terms of the Bayesian posterior probabilities. For real applications, a family of discriminant kernel functions can be defined by changing the estimation method of the Bayesian posterior probabilities. In this paper, we propose and evaluate a support vector machine (SVM) in which the discriminant kernel functions are used as the kernel. We call this SVM the discriminant-kernel-based support vector machine (DKSVM). In the experiments, we compare the proposed DKSVM with the standard SVM.
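As a rough illustration of the idea described above (not the paper's exact implementation), the sketch below estimates Bayesian posterior probabilities with an auxiliary probabilistic classifier, forms a posterior-based kernel matrix, and feeds it to an SVM with a precomputed kernel. The specific kernel form K(x, y) = Σ_k P(k|x)·P(k|y)/P(k) and the use of logistic regression as the posterior estimator are assumptions made for the example; the paper should be consulted for the exact definition of the discriminant kernel.

```python
# Illustrative sketch only: posterior-based kernel + SVM with a precomputed kernel.
# The kernel form K(x, y) = sum_k P(k|x) * P(k|y) / P(k) and the logistic-regression
# posterior estimator are assumptions, not the paper's exact definitions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def posterior_kernel(P_a, P_b, priors):
    """Kernel matrix between two sample sets from their posterior estimates."""
    # K[i, j] = sum_k P(k | x_i) * P(k | y_j) / P(k)
    return (P_a / priors) @ P_b.T


X, y = make_classification(n_samples=400, n_features=20, n_classes=3,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Estimate posteriors P(k | x) with an auxiliary probabilistic classifier.
post_model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
P_tr = post_model.predict_proba(X_tr)
P_te = post_model.predict_proba(X_te)
priors = np.bincount(y_tr) / len(y_tr)

# Train and evaluate an SVM on the precomputed posterior-based kernel.
svm = SVC(kernel="precomputed").fit(posterior_kernel(P_tr, P_tr, priors), y_tr)
print("test accuracy:", svm.score(posterior_kernel(P_te, P_tr, priors), y_te))
```

Changing the posterior estimator (e.g., a neural network or kernel density estimate instead of logistic regression) yields a different member of the family of discriminant kernel functions mentioned in the abstract.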
