Abstract
Subspaces are widely used to represent objects under varying viewpoints, illuminations, identities, and more. Due to the growing volume and dimensionality of visual content, fast search in a large-scale database of high-dimensional subspaces is an important task in many applications, such as image retrieval, clustering, video retrieval, and visual recognition. This can be facilitated by approximate nearest subspace (ANS) search, which requires an effective subspace representation. All existing methods for this problem represent a subspace by a point in Euclidean or Grassmannian space before applying approximate nearest neighbor (ANN) search. However, the efficiency of these methods is not guaranteed, because the subspace representation step can be very time-consuming for high-dimensional data. Moreover, the subspace-to-point transformation may lose subspace structural information, which degrades search accuracy. In this paper, we present a new approach to hashing-based ANS search that binarizes a subspace directly, without first transforming it into a vector. The proposed method learns binary codes for subspaces under a similarity-preserving criterion and simultaneously uses the learned codes to train matrix classifiers as hash functions. Experiments on face and action recognition and video retrieval show that our method outperforms several state-of-the-art methods in both efficiency and accuracy. We also compare our method with vector-based hashing methods; the results further confirm the superiority of our subspace-matrix-based search scheme.
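To make the idea concrete, below is a minimal NumPy sketch of how matrix classifiers could binarize a subspace directly from its orthonormal basis, without flattening it into a vector. This is an illustration under stated assumptions, not the paper's trained model: the bilinear score tr(W^T U U^T W), the random (rather than learned) classifiers W, and the thresholds are all hypothetical choices made here for demonstration.

```python
import numpy as np

def subspace_basis(X, k):
    """Orthonormal basis (d x k) of the k-dimensional subspace spanned
    by the columns of the data matrix X (d x n), via truncated SVD."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

def hash_subspace(U, W_list, thresholds):
    """One bit per matrix classifier W (d x k): the bit is the sign of a
    bilinear score tr(W^T U U^T W) between the classifier and the
    subspace's projection matrix, shifted by a threshold.
    (Illustrative form; the paper learns its hash functions.)"""
    P = U @ U.T  # projection matrix of the subspace
    bits = [np.trace(W.T @ P @ W) - t >= 0
            for W, t in zip(W_list, thresholds)]
    return np.array(bits, dtype=np.uint8)

def hamming(a, b):
    """Hamming distance between two binary codes."""
    return int(np.count_nonzero(a != b))

# Toy usage: hash three random 4-dim subspaces of R^32 into 16-bit codes.
rng = np.random.default_rng(0)
d, k, n_bits = 32, 4, 16
W_list = [rng.standard_normal((d, k)) for _ in range(n_bits)]
# Placeholder thresholds near the expected score for random W,
# which yields roughly balanced bits.
thresholds = np.full(n_bits, float(k * k))
codes = [hash_subspace(subspace_basis(rng.standard_normal((d, 50)), k),
                       W_list, thresholds) for _ in range(3)]
print(hamming(codes[0], codes[1]), hamming(codes[0], codes[2]))
```

In the method proposed here, the classifier matrices and thresholds would be learned jointly with the similarity-preserving binary codes rather than drawn at random; a nearest-subspace query then reduces to a fast Hamming-distance lookup over the stored codes.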