Abstract

Hash learning has attracted significant attention because it saves storage space and enables fast retrieval. One of the most representative hashing methods is Supervised Discrete Hashing (SDH). However, SDH suffers from several problems. First, SDH-based methods overlook the potential of sparse feature extraction. Second, SDH cannot prevent large information loss between the binary codes and the features. Third, the discrete cyclic coordinate descent (DCC) method it relies on is time-consuming. To address these issues, we propose a novel regression model for hash learning named Jointly Sparse Fast Hashing (JSFH). By designing an orthogonal transformation, we introduce jointly sparse regression into hash learning. The proposed method integrates feature extraction and hash learning into a unified framework, so that the original data and the label information can be comprehensively utilized and the binary codes generated by our method are more discriminative with less information loss. Furthermore, we adopt an iterative optimization algorithm in which each variable admits a closed-form solution, making it highly efficient compared to classic supervised hashing methods. Experimental results on four large-scale datasets demonstrate its superior performance.
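To make the regression-based hashing setup concrete, the following is a minimal toy sketch of the general idea behind SDH-style methods: alternating closed-form updates between a regression matrix and binary codes. This is NOT the authors' JSFH (which additionally involves jointly sparse regression and an orthogonal transformation); the objective, variable names, and the simplified sign-based code update here are illustrative assumptions only.

```python
import numpy as np

# Toy sketch of regression-based supervised hashing (illustrative only, not JSFH).
# Assumed objective: min_{B,W} ||Y - B @ W||^2 + lam * ||W||^2,  B in {-1,+1}^{n x r}
rng = np.random.default_rng(0)
n, d, r, c = 200, 32, 16, 5              # samples, feature dim, code length, classes
X = rng.standard_normal((n, d))          # features (unused in this simplified loop)
Y = np.eye(c)[rng.integers(0, c, n)]     # one-hot label matrix

lam = 1.0
B = np.sign(rng.standard_normal((n, r))) # random initial binary codes

for _ in range(10):
    # W-step: ridge regression, closed form
    W = np.linalg.solve(B.T @ B + lam * np.eye(r), B.T @ Y)
    # B-step: simplified sign update (ignores coupling between bits;
    # SDH's DCC instead updates the code matrix one bit column at a time)
    B = np.sign(Y @ W.T)
    B[B == 0] = 1                        # keep codes strictly in {-1, +1}

loss = np.linalg.norm(Y - B @ W) ** 2
```

Each step above has a closed-form solution, which is the property the abstract highlights as the source of the proposed method's efficiency relative to DCC's bit-by-bit updates.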
