Abstract

Randomized dimensionality reduction has been recognized as one of the cornerstones in handling high-dimensional data, originating in various foundational works such as the celebrated Johnson-Lindenstrauss Lemma. More specifically, nearest neighbor-preserving embeddings exist for the ℓ2 (Euclidean) and ℓ1 (Manhattan) metrics, as well as for doubling subsets of ℓ2, where doubling dimension is today the most effective way of capturing intrinsic dimensionality, as well as input structure, in various applications. These randomized embeddings bound the distortion only for distances between the query point and the point set. Motivated by the foundational character of fast Approximate Nearest Neighbor search in ℓ1, this paper settles an important missing case, namely that of doubling subsets of ℓ1. In particular, we introduce a randomized dimensionality reduction by means of a near neighbor-preserving embedding, which is related to the decision-with-witness problem. The input set is represented by a carefully chosen covering point set; in a second step, the algorithm randomly projects the latter. To obtain the covering point sets, we leverage either approximate r-nets or randomly shifted grids, with different tradeoffs between preprocessing time and target dimension. We exploit Cauchy random variables and derive a concentration bound of independent interest. Our algorithms are rather simple and should therefore be useful in practice.
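For intuition only, the following minimal Python sketch mimics the two-step pipeline described above; it is not the paper's actual construction, and the cell width w, target dimension k, and the median-based distance estimator are illustrative assumptions. Points are first snapped to a randomly shifted grid to produce a covering point set, which is then projected by a matrix of i.i.d. Cauchy (1-stable) random variables.

```python
import numpy as np

rng = np.random.default_rng(0)

def snap_to_shifted_grid(X, w, shift):
    # Map each point to the center of its cell in an axis-aligned grid
    # of side w, randomly shifted by `shift`; the distinct cell centers
    # form a covering point set for X.
    return (np.floor((X + shift) / w) + 0.5) * w - shift

def cauchy_projection_matrix(d, k, rng):
    # k x d matrix of i.i.d. standard Cauchy entries. Cauchy variables
    # are 1-stable: for any x, each coordinate of Gx is distributed as
    # ||x||_1 times a standard Cauchy variable.
    return rng.standard_cauchy(size=(k, d))

def l1_estimate(u, v):
    # Median-based estimator of the original l1 distance from the
    # projected points: the Cauchy distribution has no finite mean,
    # so one uses the median of the absolute coordinates instead
    # (the median of |standard Cauchy| equals 1).
    return np.median(np.abs(u - v))

# Toy usage with hypothetical parameters.
n, d, k, w = 100, 200, 50, 0.05
X = rng.random((n, d))
shift = rng.random(d) * w                  # random shift of the grid
C = snap_to_shifted_grid(X, w, shift)      # step 1: covering point set
G = cauchy_projection_matrix(d, k, rng)    # step 2: 1-stable projection
Y = C @ G.T                                # projected covering points

print("true l1 distance:        ", np.abs(X[0] - X[1]).sum())
print("estimate from projection:", l1_estimate(Y[0], Y[1]))
```

Because Cauchy variables have no finite mean, standard averaging estimators do not apply; it is the median of the absolute projected coordinates that concentrates around the ℓ1 distance, which is the kind of concentration phenomenon the abstract alludes to.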
