Abstract

Advances in non-linear dimensionality reduction provide a way to understand and visualise the underlying structure of complex datasets. The performance of large-scale non-linear dimensionality reduction is of key importance in data mining, machine learning, and data analysis. In this paper, we concentrate on improving the performance of non-linear dimensionality reduction on large-scale datasets using the GPU. In particular, we focus on solving the problems of k-nearest neighbour (KNN) search and sparse spectral decomposition for large-scale data, and propose an efficient framework for locally linear embedding (LLE). We implement a k-d tree-based KNN algorithm and a Krylov subspace method on the GPU to accelerate non-linear dimensionality reduction for large-scale data. Our GPU-based k-d tree LLE runs up to about 30–60× faster than the brute-force KNN (Hernandez et al., 2007) LLE model on the CPU. Overall, our methods save O(n² − 6n − 2k − 3) memory space.
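
The abstract describes a three-stage LLE pipeline: k-d tree KNN search, local reconstruction-weight fitting, and sparse spectral decomposition via a Krylov subspace method. The paper's implementation runs the first and third stages on the GPU; as a point of reference only, the following is a minimal CPU sketch of the same pipeline in Python, using SciPy's cKDTree for the k-d tree KNN search and eigsh (an ARPACK Krylov-subspace solver) for the spectral step. The function name lle and the parameters k, d, and reg are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix, identity
from scipy.sparse.linalg import eigsh

def lle(X, k=10, d=2, reg=1e-3):
    """CPU reference sketch of the LLE pipeline (names/parameters are assumptions)."""
    n = X.shape[0]
    # Stage 1: k-nearest-neighbour search with a k-d tree
    # (the step the paper accelerates on the GPU).
    tree = cKDTree(X)
    _, idx = tree.query(X, k=k + 1)
    idx = idx[:, 1:]                        # drop each point's self-match
    # Stage 2: fit reconstruction weights expressing each point
    # as an affine combination of its k neighbours.
    W = lil_matrix((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                # centred neighbourhood
        C = Z @ Z.T                         # local Gram matrix
        C += reg * np.trace(C) * np.eye(k)  # regularise for numerical stability
        w = np.linalg.solve(C, np.ones(k))
        W[i, idx[i]] = w / w.sum()
    # Stage 3: sparse spectral decomposition of M = (I - W)^T (I - W)
    # with a Krylov-subspace (implicitly restarted Lanczos) solver,
    # the other step the paper moves to the GPU.
    I_W = identity(n) - W.tocsr()
    M = (I_W.T @ I_W).tocsc()
    vals, vecs = eigsh(M, k=d + 1, sigma=0.0)  # d+1 smallest eigenpairs
    order = np.argsort(vals)
    return vecs[:, order[1:d + 1]]             # discard the constant eigenvector

On large inputs the KNN query and the eigensolve dominate the runtime, which is consistent with the paper's choice to offload exactly those two stages to the GPU.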
