Abstract

We propose a novel method, Kernel Neighborhood Discriminant Analysis (KNDA), which can be regarded as a supervised kernel extension of Locality Preserving Projection (LPP). KNDA nonlinearly maps the original data into a kernel space, in which two graphs are constructed to depict the within-class and between-class submanifolds. A criterion function that minimizes the quotient of the within-class representation over the between-class representation is then designed to separate the submanifold formed by each class. The main contribution of this paper is that we extend submanifold-based learning to a general model and, through derivation, obtain a simple result by which a given object can be classified to a predefined class effectively. Experiments on the MNIST Handwritten Digits database, the Binary Alphadigits database, the ORL face database, the Extended Yale Face Database B, and a downloaded documents dataset demonstrate the effectiveness and robustness of the proposed method.
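
To make the graph-construction step concrete, below is a minimal sketch (not the authors' implementation) of how the two neighborhood graphs could be built in kernel space, assuming an RBF kernel and symmetric k-nearest-neighbor affinities; `rbf_kernel`, `build_graphs`, and the parameter `k` are illustrative names, not from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise Gaussian kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def build_graphs(K, y, k=5):
    # Squared distances in the kernel-induced feature space:
    # ||phi(x_i) - phi(x_j)||^2 = K[i, i] + K[j, j] - 2 * K[i, j].
    n = K.shape[0]
    diag = np.diag(K)
    dist2 = diag[:, None] + diag[None, :] - 2.0 * K
    W_min = np.zeros((n, n))   # within-class (same-label) affinity graph
    W_max = np.zeros((n, n))   # between-class (different-label) affinity graph
    for i in range(n):
        order = np.argsort(dist2[i])
        same = [j for j in order if j != i and y[j] == y[i]][:k]
        diff = [j for j in order if y[j] != y[i]][:k]
        W_min[i, same] = 1.0
        W_max[i, diff] = 1.0
    # Symmetrize so that both graphs are undirected.
    W_min = np.maximum(W_min, W_min.T)
    W_max = np.maximum(W_max, W_max.T)
    return W_min, W_max
```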

Highlights

  • In many practical applications, such as data mining, machine learning, and computer vision, dimensionality reduction is a necessary preprocessing step for noise reduction and for lowering computational complexity

  • We propose a novel method, called Kernel Neighborhood Discriminant Analysis (KNDA), which can be regarded as a supervised kernel extension of Locality Preserving Projection (LPP)

  • By using a different locally geometrical intuition, we propose a novel submanifold learning method, called Kernel Neighborhood Discriminant Analysis (KNDA), which is based on the kernel trick [21,22,23]


Summary

Introduction

In many practical applications, such as data mining, machine learning, and computer vision, dimensionality reduction is a necessary preprocessing step for noise reduction and for lowering computational complexity. Although LPP has a strong theoretical basis and can project high-dimensional data into a low-dimensional space while preserving the local geometric structure of the original manifold, it does not use the class relationships of the data points, which are more important in pattern classification. As in LPP, we construct an affinity graph whose vertices are the data points in the kernel space. Since Kernel Neighborhood Discriminant Analysis is designed to preserve the within-class geometric structure of each submanifold while pushing apart the between-class submanifolds, we define the criterion function as

$$\arg\min_{\mu} \frac{\mu^{T} K (D_{\min} - W_{\min}) K \mu}{\mu^{T} K (D_{\max} - W_{\max}) K \mu},$$

which, by fixing the denominator, can be reformulated as a constrained minimization problem:

$$\min_{\mu}\; \mu^{T} K (D_{\min} - W_{\min}) K \mu \quad (17)$$
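
The following is a minimal sketch of how the constrained problem in Eq. (17) could be solved, under the assumption that the between-class matrices are denoted D_max and W_max as above: minimizing the quotient reduces to the smallest generalized eigenvectors of K(D_min − W_min)K μ = λ K(D_max − W_max)K μ. The ridge term `reg` and the function name `knda_projection` are assumptions for illustration, not the paper's notation.

```python
import numpy as np
from scipy.linalg import eigh

def knda_projection(K, W_min, W_max, n_components=2, reg=1e-6):
    # Graph Laplacians D - W for the within- and between-class graphs.
    L_min = np.diag(W_min.sum(axis=1)) - W_min
    L_max = np.diag(W_max.sum(axis=1)) - W_max
    A = K @ L_min @ K                 # within-class term (to be minimized)
    B = K @ L_max @ K                 # between-class term (held in the constraint)
    B = B + reg * np.eye(K.shape[0])  # assumed ridge term: keeps B positive definite
    # Smallest generalized eigenpairs of A mu = lambda B mu minimize the quotient;
    # scipy's eigh returns eigenvalues in ascending order.
    vals, vecs = eigh(A, B)
    return vecs[:, :n_components]     # columns are the expansion coefficients mu
```

With the learned coefficients μ, a new point x would be embedded as z = [k(x, x_1), …, k(x, x_n)] μ, i.e., its kernel evaluations against the training set projected through μ.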

Experiments and Discussions
Digit Visualization
Face Recognition
Discussions
Findings
Conclusions