Abstract

Manifold learning seeks low-dimensional manifolds embedded in high-dimensional data, making it possible to discard redundant information from the input. Linear manifold learning algorithms apply directly to out-of-sample data, which makes them fast and practical, especially for classification purposes. Locality preserving projection (LPP) and orthogonal locality preserving projection (OLPP) are two well-known linear manifold learning algorithms. In this study, the scatter information of a distance matrix is used to construct a weight matrix with a supervised approach for the LPP and OLPP algorithms, with the aim of improving classification accuracy. The low-dimensional data are classified with an SVM, and the results of the proposed method are compared with those of other prominent linear manifold learning methods. Class-based improvements and the coefficients proposed in the formulation are reported visually. Furthermore, the changes in the weight matrices, band information, and correlation matrices with p-values are extracted and visualized to examine the effect of the proposed method. Experiments are conducted on two hyperspectral imaging (HSI) datasets. According to the experimental results, the proposed method, applied with either LPP or OLPP, outperforms the traditional LPP, OLPP, neighborhood preserving embedding (NPE), and orthogonal neighborhood preserving embedding (ONPE) algorithms. Moreover, the analytical findings from the visualizations are consistent with the observed classification accuracy gains.
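The evaluation pipeline described above (linear dimension reduction followed by SVM classification) can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's method: PCA stands in for the supervised (O)LPP projection, and the labels and dimensions are invented for the example.

```python
# Hedged sketch: linear DR followed by SVM classification.
# PCA is a stand-in for the paper's (O)LPP projection; data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n, d, k = 200, 50, 10                    # samples, original dims, reduced dims
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.1 * rng.normal(size=n) > 0).astype(int)  # toy labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pca = PCA(n_components=k).fit(X_tr)      # linear projection learned on training data
clf = SVC(kernel="rbf").fit(pca.transform(X_tr), y_tr)
acc = accuracy_score(y_te, clf.predict(pca.transform(X_te)))
```

Because the projection is linear, out-of-sample points are embedded with a single matrix multiplication, which is the practicality advantage the abstract attributes to linear manifold learning.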

Highlights

Published: 28 September 2021

  • Dimension reduction (DR) techniques are used to obtain new feature sets in a subspace from high-dimensional original data

  • Manifold learning is a special kind of technique for DR, which assumes that there is a low-dimensional manifold on high-dimensional data and its purpose is to find this manifold [3]

  • The most successful linear manifold learning method is shown in bold type


Summary

Introduction

Dimension reduction (DR) techniques are used to obtain new feature sets in a subspace of the high-dimensional original data. This helps to remove redundant data and noise from the input and provides advantages, especially in terms of computation time and data storage [1,2]. Manifold learning aims to preserve the local or global geometric relations of high-dimensional data without losing its graph topology [5,6,7]. Isometric feature mapping (Isomap), locally linear embedding (LLE), and Laplacian eigenmaps (LE) are the most widely used nonlinear manifold learning methods. While Isomap is a global approach [8], LE [9] and LLE [10] retain the local geometry of the data.
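A standard (unsupervised) LPP, the linear method the paper builds on, can be sketched as follows: build a k-nearest-neighbor graph with heat-kernel weights, form the graph Laplacian, and solve a generalized eigenproblem for the projection. This is the textbook formulation, not the paper's supervised scatter-based weight matrix; the bandwidth heuristic and ridge term are implementation choices for the example.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, n_neighbors=5, t=None):
    """Sketch of standard Locality Preserving Projection.
    Returns a (d x n_components) projection matrix A; embed with X @ A."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    if t is None:
        t = np.median(d2[d2 > 0])                # heat-kernel bandwidth heuristic
    # k-nearest-neighbor adjacency with heat-kernel weights, symmetrized
    idx = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = np.exp(-d2[i, idx[i]] / t)
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W                                    # graph Laplacian
    # Generalized eigenproblem: X^T L X a = lambda X^T D X a
    A_mat = X.T @ L @ X
    B_mat = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # ridge for numerical stability
    vals, vecs = eigh(A_mat, B_mat)
    return vecs[:, :n_components]                # eigenvectors of smallest eigenvalues

# Usage: project 10-D points onto a 2-D subspace
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))
A = lpp(X, n_components=2)
Y = X @ A
```

The supervised variant studied in the paper would change how the weight matrix `W` is built (using class labels and scatter information from the distance matrix), while the eigenproblem and the linear projection step stay the same; OLPP additionally orthogonalizes the columns of `A`.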

