Abstract

Improving computational efficiency and extending representational capability are two of the most active topics in global manifold learning research. In this paper, a new method called extensive landmark Isomap (EL-Isomap) is presented, addressing both topics simultaneously. On one hand, originating from landmark Isomap (L-Isomap), which is known for its high computational efficiency, EL-Isomap also achieves high computational efficiency by utilizing a small set of landmarks to embed all data points. On the other hand, EL-Isomap significantly extends the representational capability of L-Isomap and other global manifold learning approaches by embedding each point with only an available subset of the landmark set instead of the whole set. In particular, compared with other manifold learning approaches, the new method more successfully unwraps data manifolds with intrinsically low-dimensional concave topologies and essential loops, as shown by simulation results on a series of synthetic and real-world data sets. Moreover, the accuracy, robustness, and computational complexity of EL-Isomap are analyzed in this paper, and the relation between EL-Isomap and L-Isomap is also discussed theoretically.

Highlights

  • Nonlinear dimensionality reduction (NLDR) is an attractive topic in many scientific fields [1,2,3,4]

  • This section mainly aims at demonstrating the performance of EL-Isomap in extending representational capacity, by comparison with Isomap, L-Isomap, LLE, and Laplacian eigenmaps

  • The first data set applied in this series of simulations is composed of 3000 points sampled from a low-dimensional manifold with the shape shown in Figure 1(a)

Introduction

Nonlinear dimensionality reduction (NLDR) is an attractive topic in many scientific fields [1,2,3,4]. Based on their intrinsic construction principles, these approaches can be divided into two categories: global and local approaches. Global approaches, such as Isomap [1] and CDA [10], attempt to preserve geometry at both local and global scales, essentially constructing an entire isometric correspondence between all data pairs in the original and latent spaces. Local approaches, such as LLE [2] and Laplacian eigenmaps [11], attempt to preserve the local geometry of the data, intrinsically keeping invariance between all local areas in the original and latent spaces.
