Abstract

Hashing methods have attracted much attention for image retrieval due to their favorable time and storage properties. Most existing methods learn similarity-preserving hash functions in a centralized setting. However, modern data storage systems are distributed for scalability, and aggregating all the data at a fusion center is infeasible because of the prohibitive communication and computation overhead. Motivated by this, several methods have been proposed for hashing over distributed data, but they mostly extend one specific hashing scheme to a distributed model without considering generality. In this letter, we propose a novel general distributed hash learning model that can serve as an effective distributed counterpart of most hashing methods. The proposed model achieves up to 15.2% accuracy gains over state-of-the-art distributed hashing methods, while its communication cost is independent of the data size.
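To illustrate what "similarity-preserving" means here, the following is a minimal sketch of random-hyperplane hashing, a classical scheme in which each bit is the sign of a random projection, so similar vectors (small angle) tend to receive codes with small Hamming distance. The dimensions, bit count, and function names below are illustrative choices, not taken from the letter, and this is not the proposed distributed model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hash(dim, n_bits, rng):
    # Each bit is the sign of a projection onto a random hyperplane;
    # this approximately preserves cosine similarity (hypothetical example).
    planes = rng.standard_normal((n_bits, dim))
    return lambda x: (planes @ x > 0).astype(np.uint8)

h = make_hash(dim=128, n_bits=32, rng=rng)

x = rng.standard_normal(128)
y = x + 0.1 * rng.standard_normal(128)   # near-duplicate of x
z = rng.standard_normal(128)             # unrelated vector

def ham(a, b):
    # Hamming distance between two binary codes.
    return int(np.sum(a != b))

# A similar pair should disagree on fewer bits than a dissimilar pair.
print(ham(h(x), h(y)), ham(h(x), h(z)))
```

Compact binary codes like these are what make retrieval cheap: searching reduces to Hamming-distance comparisons on short bit strings instead of floating-point distances on full feature vectors.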
