Abstract

High-dimensional $k$-nearest-neighbor ($k$NN) search has a wide range of applications in multimedia information retrieval. Existing disk-based $k$NN search methods incur significant I/O costs in the candidate refinement phase. In this paper, we propose to cache compact approximate representations of data points in main memory in order to reduce the candidate refinement time during $k$NN search. This problem raises two challenging issues: (i) which is the most effective encoding scheme for data points to support $k$NN search? and (ii) what is the optimal number of bits for encoding a data point? For (i), we formulate and solve a novel histogram optimization problem that decides the most effective encoding scheme. For (ii), we develop a cost model for automatically tuning the optimal number of bits for encoding points. In addition, our approach is generic and applicable to exact/approximate $k$NN search methods. Extensive experimental results on real datasets demonstrate that our proposal can accelerate the candidate refinement time of $k$NN search by at least an order of magnitude.
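The core idea, caching compact quantized codes in memory and using them to prune candidates before verifying survivors against the full disk-resident vectors, can be sketched as follows. This is a minimal illustration using plain per-dimension scalar quantization with a fixed bit budget, not the paper's optimized histogram-based encoding or its cost-model-tuned bit count; all function names are hypothetical.

```python
import numpy as np

def encode(points, bits=4):
    """Quantize each dimension to 2**bits levels; return codes + decode params.

    The uint8 codes are the compact in-memory cache: at 4 bits of precision
    per dimension they are far smaller than the original float vectors.
    """
    lo, hi = points.min(axis=0), points.max(axis=0)
    scale = (2 ** bits - 1) / np.maximum(hi - lo, 1e-12)
    codes = np.round((points - lo) * scale).astype(np.uint8)
    return codes, lo, scale

def approx_dists(query, codes, lo, scale):
    """Approximate squared L2 distances computed purely from cached codes."""
    recon = codes / scale + lo          # decode to approximate coordinates
    diff = recon - query
    return (diff * diff).sum(axis=1)

def refine(query, candidates, codes, lo, scale, points_on_disk, k):
    """Prune candidates with cheap in-memory distances, then verify the
    survivors with the exact vectors (which would require disk reads)."""
    ad = approx_dists(query, codes[candidates], lo, scale)
    # Keep, say, the 2k best by approximate distance before the exact check;
    # only these incur I/O, which is where the refinement savings come from.
    keep = np.asarray(candidates)[np.argsort(ad)[: 2 * k]]
    exact = ((points_on_disk[keep] - query) ** 2).sum(axis=1)
    return keep[np.argsort(exact)[:k]]

rng = np.random.default_rng(0)
data = rng.random((1000, 16)).astype(np.float32)   # stand-in for disk-resident points
codes, lo, scale = encode(data)
q = rng.random(16).astype(np.float32)
top = refine(q, np.arange(1000), codes, lo, scale, data, k=5)
```

The pruning factor (here a fixed 2k) and the bit budget trade accuracy against memory and I/O; the paper's cost model tunes the latter automatically rather than fixing it by hand.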
