Abstract

Dimensionality Reduction (DR) is useful for understanding high-dimensional data. It attracts wide attention from industry and academia and is employed in areas such as machine learning, data mining, and pattern recognition. This work presents a geometric approach to DR termed Polygonal Coordinate System (PCS), capable of representing multidimensional data in two or three dimensions while preserving their inherent overall structure by means of a polygonal interface that bridges the high- and low-dimensional spaces. PCS can handle Big Data by adopting an incremental, geometric DR with linear-time complexity. A new version of t-Distributed Stochastic Neighbor Embedding (t-SNE), a state-of-the-art algorithm for DR, is also provided; it employs a PCS-based deterministic strategy and is named t-Distributed Deterministic Neighbor Embedding (t-DNE). Several synthetic and real data sets, representing well-known real-world problem archetypes, were used in our benchmark to evaluate PCS and t-DNE against four embedding-based DR algorithms: two linear-transformation ones (Principal Component Analysis and Non-negative Matrix Factorization) and two nonlinear ones (t-SNE and Sammon's Mapping). Statistical comparisons of the algorithms' execution times, using Friedman's significance test, highlight the efficiency of PCS in data embedding. PCS tends to surpass its counterparts in several aspects explored in this work, including asymptotic time and space complexity, preservation of global data-inherent structures, number of hyperparameters, and applicability to unobserved data.
