Abstract

Dimensionality reduction has many applications in data visualization and machine learning. Existing methods can be classified as global or local. Global methods usually learn linear relationships in the data, while local methods learn the intrinsic geometric structure of the manifold, which has a significant impact on pattern recognition. However, most existing local methods obtain an embedding through eigenvalue or singular value decomposition, whose computational complexity is very high for large amounts of high-dimensional data. In this paper, we propose a local nonlinear dimensionality reduction method named Vec2vec, which employs a neural network with only one hidden layer to reduce the computational complexity. We first build a neighborhood similarity graph from the input matrix, then define the contexts of data points using random walks on the graph, and finally train the neural network on these contexts to learn the embedding of the matrix. We conduct extensive classification and clustering experiments on nine image and text datasets to evaluate the performance of our method. Experimental results show that Vec2vec outperforms several state-of-the-art dimensionality reduction methods; the one exception is that, in statistical hypothesis tests, it is equivalent to UMAP on data clustering tasks, although Vec2vec requires less computational time than UMAP on high-dimensional data. Furthermore, we propose a more lightweight variant named Approximate Vec2vec (AVec2vec), which builds the neighborhood similarity graph approximately with little performance degradation. AVec2vec still outperforms several state-of-the-art local dimensionality reduction methods and is competitive with UMAP on data classification and clustering tasks in statistical hypothesis tests.
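
To make the pipeline described above concrete, the sketch below builds a k-nearest-neighbor similarity graph, samples uniform random walks to form the contexts of data points, and trains a single-hidden-layer skip-gram network (here, gensim's Word2Vec) to produce the embedding. The choice of cosine k-NN graph, uniform walks, and gensim are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the Vec2vec pipeline (not the authors' implementation):
    # 1) neighborhood similarity graph, 2) random-walk contexts,
    # 3) one-hidden-layer skip-gram network for the embedding.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from gensim.models import Word2Vec

    def vec2vec_sketch(X, dim=2, k=10, walks_per_node=10, walk_length=40, seed=0):
        rng = np.random.default_rng(seed)
        n = X.shape[0]

        # Step 1: neighborhood similarity graph (k nearest neighbors per point).
        knn = NearestNeighbors(n_neighbors=k + 1, metric="cosine").fit(X)
        _, idx = knn.kneighbors(X)                 # idx[i] holds i itself plus its k neighbors
        neighbors = [idx[i, 1:] for i in range(n)]

        # Step 2: contexts of data points via uniform random walks on the graph.
        walks = []
        for _ in range(walks_per_node):
            for start in range(n):
                walk = [start]
                for _ in range(walk_length - 1):
                    walk.append(int(rng.choice(neighbors[walk[-1]])))
                walks.append([str(v) for v in walk])   # Word2Vec expects token sequences

        # Step 3: a skip-gram network with one hidden layer learns the embedding.
        model = Word2Vec(walks, vector_size=dim, window=5, min_count=1, sg=1, seed=seed)
        return np.vstack([model.wv[str(i)] for i in range(n)])

In this sketch, the row order of the returned matrix matches the rows of the input X, so the low-dimensional vectors can be fed directly to a downstream classifier or clustering algorithm.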
