Abstract

Existing hashing methods normally define a specific form of hash function, after which an objective function is formulated and the loss on the training set is optimized to learn the parameters. In this way, however, the hash function is usually tightly coupled with the resulting objective. Moreover, since the objectives generally involve binary quantization, most of them are nonconvex, which makes the optimization difficult and consequently degrades the similarity-preserving performance of hashing. To address this problem, we propose a novel pairwise correlation preserving framework to learn compact binary codes for hashing. First, we project each data point into a metric space and represent it as a vector encoding the underlying local and global structure through pairwise correlation learning. Afterwards, pairwise correlation reconstruction (PCR) is proposed to preserve the correlations of the data between the metric space and the Hamming space when learning binary codes. The PCR model is convex, no specific hash function needs to be predefined, and the correlation learning and reconstruction steps are independent. These characteristics make the optimization of PCR easy and efficient, and thus lead to better preservation of data similarity in the Hamming space.
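To make the two-stage pipeline described above concrete, the following is a minimal sketch, not the paper's actual PCR formulation: `pairwise_correlation` is a hypothetical stand-in for the correlation-learning step (here a Gaussian affinity over Euclidean distances), and `reconstruct_codes` illustrates one common relaxation of correlation reconstruction, fitting codes B so that B B^T approximates the correlation matrix via its top eigenvectors and then binarizing by sign. The exact objective and solver used by PCR are given in the full text.

```python
import numpy as np

def pairwise_correlation(X, sigma=1.0):
    """Stand-in for the correlation-learning step: build a pairwise affinity
    matrix in a metric space with a Gaussian kernel on Euclidean distances
    (the paper's own embedding may differ)."""
    sq_dists = (np.sum(X**2, axis=1, keepdims=True)
                + np.sum(X**2, axis=1)
                - 2.0 * X @ X.T)
    return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * sigma**2))

def reconstruct_codes(S, n_bits=16):
    """Illustrative correlation-reconstruction step: find relaxed codes B
    minimizing ||S - B B^T||_F^2 via the top eigenvectors of S (a spectral
    relaxation), then binarize by sign thresholding."""
    eigvals, eigvecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    top = eigvecs[:, -n_bits:] * np.sqrt(np.maximum(eigvals[-n_bits:], 0.0))
    return np.where(top >= 0, 1, -1)              # binary codes in {-1, +1}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 32))            # toy data: 200 points, 32-d
    S = pairwise_correlation(X)                   # correlations in the metric space
    B = reconstruct_codes(S, n_bits=16)           # codes preserving those correlations
    print(B.shape)                                # (200, 16)
```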
