Abstract

Dimensionality reduction is an important issue for numerous applications, including biomedical image analysis and living system analysis. Neighbor embedding methods, which capture both the global and local structure of data and can handle multiple manifolds, such as the elastic embedding (EE) technique, can go beyond traditional dimensionality reduction methods and find better optima. Nevertheless, existing neighbor embedding algorithms cannot be directly applied to classification because they suffer from several problems: (1) high computational complexity, (2) nonparametric mappings, and (3) lack of class label information. We propose a supervised neighbor embedding called discriminative elastic embedding (DEE), which integrates a linear projection matrix and class labels into the final objective function. In addition, we present the Laplacian search direction for fast convergence. DEE is evaluated in three aspects: embedding visualization, training efficiency, and classification performance. Experimental results on several benchmark databases show that the proposed DEE is a supervised dimensionality reduction approach that not only has strong pattern-revealing capability, but also brings computational advantages over standard gradient-based methods.
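As a concrete illustration of the kind of objective described above, the following is a minimal sketch of a supervised, elastic-embedding-style cost evaluated on a linear projection Y = XW. The weighting scheme (attractive weights over same-class nearest neighbors, repulsive weights over different-class pairs), the Gaussian bandwidth, and all names are illustrative assumptions, not the paper's exact DEE formulation.

```python
import numpy as np

def dee_objective_sketch(W, X, labels, lam=1.0, k=5):
    """Elastic-embedding-style objective evaluated on a linear map Y = X @ W.

    Hypothetical sketch: attractive weights (w_plus) link same-class
    k-nearest neighbors, repulsive weights (w_minus) link different-class
    pairs; the paper's exact DEE weighting and objective may differ.
    """
    n = X.shape[0]
    Y = X @ W                                              # low-dimensional coordinates
    d2_x = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances (input)
    d2_y = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances (embedding)

    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)

    # attractive weights: Gaussian affinities to same-class k-nearest neighbors
    w_plus = np.zeros((n, n))
    bandwidth = np.median(d2_x) + 1e-12
    for i in range(n):
        idx = np.argsort(np.where(same[i], d2_x[i], np.inf))[:k]
        idx = idx[same[i, idx]]                            # drop padding when < k same-class points
        w_plus[i, idx] = np.exp(-d2_x[i, idx] / bandwidth)
    w_plus = 0.5 * (w_plus + w_plus.T)                     # symmetrize

    # repulsive weights: uniform over different-class pairs
    w_minus = (labels[:, None] != labels[None, :]).astype(float)

    attract = np.sum(w_plus * d2_y)                        # pull same-class neighbors together
    repel = np.sum(w_minus * np.exp(-d2_y))                # push different-class points apart
    return attract + lam * repel

# Toy usage: a random 10-D dataset projected to 2-D by a random linear map
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))
labels = rng.integers(0, 3, size=60)
W = rng.normal(size=(10, 2))
print(dee_objective_sketch(W, X, labels))
```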

Highlights

  • The classification of high-dimensional data, such as biological characteristic sequences, high-definition images, and gene expressions, remains a difficult task [1]

  • Four optimization methods are compared for discriminative elastic embedding (DEE): gradient descent (GD), used in stochastic neighbor embedding (SNE); conjugate gradients (CG), used in t-SNE; fixed-point (FP) iteration, used in EE; and the Laplacian direction (LD), presented in this paper (see the sketch after this list)

  • Afterwards, we demonstrate the effectiveness of DEE in clustering visualization compared with some classical algorithms such as t-SNE, discriminative stochastic neighbor embedding (DSNE), and EE
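The four search directions listed above differ only in how they turn the gradient of the embedding objective into an update. Below is a minimal sketch contrasting a plain gradient-descent step with a Laplacian-preconditioned step, in which the gradient is multiplied by the inverse graph Laplacian of the attractive weights; this mirrors spectral-direction optimizers for neighbor embeddings, and the weight matrix `w_plus`, the step sizes, and the ridge regularizer are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np

def gd_step(Y, grad, eta=0.1):
    """Plain gradient-descent step, as used in SNE-style optimizers."""
    return Y - eta * grad

def laplacian_direction_step(Y, grad, w_plus, eta=1.0, reg=1e-6):
    """Step along the gradient preconditioned by the graph Laplacian of the
    attractive weights (hypothetical reading of the 'Laplacian direction').

    Solving L p = grad with a positive-definite L rescales the gradient by the
    neighborhood structure, which typically allows larger, better-conditioned
    steps than raw gradient descent.
    """
    n = w_plus.shape[0]
    L = np.diag(w_plus.sum(axis=1)) - w_plus      # graph Laplacian of w_plus
    L += reg * np.eye(n)                          # small ridge so L is invertible
    direction = np.linalg.solve(L, grad)          # p = L^{-1} grad
    return Y - eta * direction

# Toy usage with random attractive weights and a random gradient
rng = np.random.default_rng(0)
n, d = 50, 2
Y = rng.normal(size=(n, d))
grad = rng.normal(size=(n, d))
A = rng.random((n, n))
w_plus = 0.5 * (A + A.T)                          # symmetric nonnegative weights
np.fill_diagonal(w_plus, 0.0)
Y_gd = gd_step(Y, grad)
Y_ld = laplacian_direction_step(Y, grad, w_plus)
```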


Summary

Introduction

The classification of high-dimensional data, such as biological characteristic sequences, high-definition images, and gene expressions, remains a difficult task [1]. The goal of dimensionality reduction (DR) is to construct a low-dimensional representation of the data in order to achieve better discrimination and accelerate subsequent processing. In this realm, fairly straightforward algorithms dominate, because the computational complexity of the more advanced DR techniques that have been proposed is too high. MMPA takes into account both intraclass and interclass geometries and possesses the orthogonality property for the projection matrix. Generally speaking, all these methods have a unique solution computed by a generalized eigensolver and exhibit acceptable performance on most data, but they
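For illustration, the common computational core of such linear DR methods is a generalized eigenproblem of the form A w = λ B w. The sketch below uses SciPy's symmetric generalized eigensolver with LDA-like between-class and within-class scatter matrices; the scatter matrices and the choice of the largest eigenvectors are generic illustrative assumptions, not the formulation of any specific method mentioned above.

```python
import numpy as np
from scipy.linalg import eigh

def linear_dr_by_generalized_eig(X, labels, n_components=2):
    """Generic linear DR via a generalized eigenproblem S_b w = lambda S_w w.

    Illustrative LDA-like sketch: S_b is the between-class scatter and S_w the
    within-class scatter; specific methods (e.g. MMPA) use their own matrices.
    """
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all)[:, None]
        S_b += Xc.shape[0] * (diff @ diff.T)
    S_w += 1e-6 * np.eye(d)                   # regularize so S_w is positive definite
    # eigh solves the symmetric-definite generalized problem; eigenvalues ascend
    eigvals, eigvecs = eigh(S_b, S_w)
    W = eigvecs[:, ::-1][:, :n_components]    # keep the largest-eigenvalue directions
    return X @ W, W
```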

