Abstract

Tensors are powerful tools for representing and processing multidimensional data. In this paper, a dimensionality reduction algorithm for multidimensional data is proposed. We consider both the global and local structures of the data to fully extract important features. Multiplying a tensor by a matrix along mode n changes the size of the corresponding dimension of the tensor; therefore, for the global structure, we use a matrix to span a subspace for each mode and minimize the distance between the tensor data and its projection onto that subspace. For the local structure, we construct a dimensionality reduction algorithm for tensor data based on local homeomorphism, so that the continuous dependency relationships of the original high-dimensional tensor data are preserved in each locality after dimensionality reduction. The proposed algorithm achieves dimensionality reduction by combining the global subspace-projection-distance minimum with local homeomorphism, thereby maximizing global variance while maintaining the local nonlinear geometric structure. Classification and clustering experiments show that the proposed algorithm is feasible and competitive with other advanced algorithms.
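The mode-n product underlying the global step can be sketched in a few lines of numpy. This is a standard implementation of the operation, not code from the paper; the function name and shapes are illustrative.

```python
import numpy as np

def mode_n_product(X, A, n):
    """Multiply tensor X by matrix A along mode n (0-indexed).

    The mode-n unfolding X_(n) has shape (I_n, prod of the other dims);
    A @ X_(n) replaces the size-I_n dimension with A.shape[0].
    """
    # Bring mode n to the front and flatten the remaining modes into columns.
    Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)
    Y = A @ Xn
    # Fold back into a tensor, with mode n resized to A.shape[0].
    new_shape = (A.shape[0],) + tuple(np.delete(X.shape, n))
    return np.moveaxis(Y.reshape(new_shape), 0, n)

X = np.random.rand(4, 5, 6)
A = np.random.rand(2, 5)        # maps mode-1 from size 5 down to size 2
Y = mode_n_product(X, A, 1)
print(Y.shape)                  # (4, 2, 6)
```

Applying one such projection matrix per mode is exactly how a tensor's dimensions are reduced one mode at a time.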

Highlights

  • With the advancement of data acquisition technologies and the increasing availability of powerful sensors, high-dimensional data are generated in many application fields, such as social networks, computer vision, and communication networks

  • This paper proposes a tensor-data dimensionality reduction algorithm based on local homeomorphism and the global subspace-projection-distance minimum (LHGPD), which combines Eqs. (25) and (31)

  • The LHGPD model indicates that the selection of the subspace W should consider two aspects: (i) the distance between the high-dimensional tensor data X_(N) and its projection onto the subspace is minimized, and (ii) the coordinates of the high-dimensional tensor data projected into the subspace are the most conducive to local homeomorphism
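The first criterion above admits a well-known closed-form sketch: for a single mode, the orthonormal basis that minimizes the distance between the mode-n unfolding and its projection is spanned by the top-r left singular vectors of that unfolding. The snippet below illustrates this standard construction under that assumption; the paper's joint optimization with the local term may differ.

```python
import numpy as np

def mode_subspace(X, n, r):
    """Return an orthonormal basis W (I_n x r) minimizing the distance
    between the mode-n unfolding X_(n) and its projection W @ W.T @ X_(n):
    the top-r left singular vectors of X_(n)."""
    Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)
    U, _, _ = np.linalg.svd(Xn, full_matrices=False)
    return U[:, :r]

X = np.random.rand(10, 8, 6)
W = mode_subspace(X, 0, 3)     # 3-dimensional subspace for mode 0
print(W.shape)                 # (10, 3)
```

The columns of W are orthonormal, so W.T @ W is the identity, which is what makes the projection W @ W.T well defined.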


Summary

INTRODUCTION

With the advancement of data acquisition technologies and the increasing availability of powerful sensors, high-dimensional data are generated in many application fields, such as social networks, computer vision, and communication networks. By combining a manifold term with tensor decomposition, Li et al. [26] proposed a model (MR-NTD) that couples the manifold term with nonnegative tensor decomposition, and Jiang et al. [27] imposed orthogonality constraints on the factor matrices, with one of the orthogonal matrices serving as the low-dimensional representation of the data (GLTD). We propose a tensor dimensionality reduction algorithm that considers both the global and local structures of the data. The algorithm uses the K-nearest-neighbor criterion to divide the original high-dimensional tensor data into local patches, maps each local patch to its corresponding tangent space, and aligns the local coordinates with the global coordinates through an affine matrix to learn the internal structure of the data.
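The local step described above (K-nearest-neighbor partition, then mapping each neighborhood to its tangent space) can be sketched in the LTSA style with local PCA. This is a generic illustration on vectorized samples, not the paper's implementation; the function name and parameter choices are assumptions.

```python
import numpy as np

def local_tangent_coords(X, k, d):
    """For each sample, find its k nearest neighbors and project the
    centered neighborhood onto a d-dimensional tangent space (local PCA),
    as in LTSA-style manifold learning.

    X: (n_samples, n_features) matrix of vectorized tensor samples.
    Returns: neighbor indices (n, k) and local coordinates (n, k, d).
    """
    n = X.shape[0]
    # Pairwise squared distances; each row's k smallest entries
    # define that point's neighborhood (including the point itself).
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(D, axis=1)[:, :k]
    coords = np.empty((n, k, d))
    for i in range(n):
        Ni = X[idx[i]] - X[idx[i]].mean(0)   # center the neighborhood
        _, _, Vt = np.linalg.svd(Ni, full_matrices=False)
        coords[i] = Ni @ Vt[:d].T            # tangent-space coordinates
    return idx, coords

X = np.random.rand(50, 20)
idx, coords = local_tangent_coords(X, k=8, d=2)
print(coords.shape)   # (50, 8, 2)
```

The subsequent alignment step then solves for a single global embedding whose restriction to each neighborhood matches these local coordinates up to an affine transformation.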

NOTATIONS AND PRELIMINARIES
TENSOR MATRICIZATION AND MULTIPLICATION
THEOREM
MANIFOLD LEARNING
PROPOSED ALGORITHM
GLOBAL SUBSPACE PROJECTION
COMPUTATIONAL COMPLEXITY ANALYSIS
EXPERIMENTS
DATA CLASSIFICATION
Findings
CONCLUSION
