Abstract

Owing to the sparseness of high-dimensional data, clustering it directly remains a challenging problem. Obtaining a compact low-dimensional representation through dimensionality reduction is therefore an effective approach for clustering high-dimensional data. Most existing dimensionality reduction methods, however, were originally developed for classification (e.g., Linear Discriminant Analysis) or for recovering the geometric structure (the manifold) of high-dimensional data (e.g., Locally Linear Embedding) rather than for clustering. Hence, a novel nonlinear discriminant clustering method based on dimensionality reduction with spectral regularization is proposed. The contributions of the proposed method are twofold: (1) it obtains a nonlinear low-dimensional representation that recovers the intrinsic manifold structure and enhances the cluster structure of the original high-dimensional data; (2) the clustering results are obtained within the dimensionality reduction procedure itself. First, the desired low-dimensional coordinates are represented as linear combinations of predefined vectors that are smooth with respect to the data manifold, which is characterized by a weighted graph. Then, the optimal combination coefficients and the optimal cluster assignment matrix are computed simultaneously by maximizing the ratio of the between-cluster scatter to the total scatter while preserving the smoothness of the cluster assignment matrix with respect to the data manifold. Finally, the optimization problem is solved by an iterative procedure, which is proved to be convergent. Experiments on UCI and real-world data sets demonstrate the effectiveness of the proposed method for both clustering and visualizing high-dimensional data.
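To make the procedure concrete, the following is a minimal sketch, not the authors' implementation, of this style of spectrally regularized discriminant clustering. It assumes the smooth vectors are the bottom eigenvectors of a k-nearest-neighbor graph Laplacian and approximates the scatter-ratio objective with a generalized eigenvalue problem inside an alternating loop; the function name and all parameters are hypothetical choices for illustration.

import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import KMeans

def spectral_regularized_clustering(X, n_clusters, n_basis=15, n_dims=2,
                                    n_neighbors=10, n_iter=20):
    # 1. Weighted graph characterizing the data manifold (k-NN connectivity graph).
    A = kneighbors_graph(X, n_neighbors, mode='connectivity', include_self=False)
    A = 0.5 * (A + A.T)                        # symmetrize the adjacency matrix
    L = laplacian(A, normed=True).toarray()    # normalized graph Laplacian

    # 2. Predefined smooth basis: eigenvectors of L with the smallest eigenvalues.
    _, U = eigh(L)
    U = U[:, :n_basis]

    # 3. Alternate between the cluster assignment and the combination coefficients W.
    labels = KMeans(n_clusters, n_init=10, random_state=0).fit_predict(U[:, :n_dims])
    for _ in range(n_iter):
        # Between-cluster scatter Sb and total scatter St in the basis space.
        mean_all = U.mean(axis=0)
        St = (U - mean_all).T @ (U - mean_all)
        Sb = np.zeros_like(St)
        for c in range(n_clusters):
            Uc = U[labels == c]
            diff = (Uc.mean(axis=0) - mean_all)[:, None]
            Sb += len(Uc) * (diff @ diff.T)
        # W maximizes trace(W^T Sb W) relative to trace(W^T St W):
        # solved here as a regularized generalized eigenvalue problem.
        _, eigvecs = eigh(Sb, St + 1e-8 * np.eye(n_basis))
        W = eigvecs[:, -n_dims:]               # top generalized eigenvectors
        Y = U @ W                              # low-dimensional coordinates
        new_labels = KMeans(n_clusters, n_init=10, random_state=0).fit_predict(Y)
        if np.array_equal(new_labels, labels): # stop when assignments are stable
            break
        labels = new_labels
    return Y, labels

Under these assumptions, Y serves as the low-dimensional embedding used for visualization and labels gives the cluster assignment produced during the reduction itself.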
