Abstract
In recent years, emerging hashing techniques have been successful in large-scale image retrieval. Owing to its strong learning ability, deep hashing has become one of the most promising solutions and has achieved good results in practice. However, existing deep hashing methods have limitations; for example, most consider only one kind of supervised loss, which leads to insufficient utilization of the supervised information. To address this issue, we propose a Triplet Deep Hashing method with Joint supervised Loss based on a convolutional neural network (JLTDH). The proposed JLTDH method combines a triplet likelihood loss with a linear classification loss; moreover, it adopts triplet supervised labels, which carry richer supervised information than pointwise or pairwise labels. At the same time, to overcome the cubic growth in the number of triplets and make triplet training more effective, we adopt a novel triplet selection method. The whole process is divided into two stages. In the first stage, the triplets generated by the triplet selection method are taken as the input of the CNN, and three CNNs with shared weights are used for image feature learning; the last layer of the network outputs a preliminary hash code. In the second stage, relying on the hash codes from the first stage and the joint loss function, the network is further optimized so that the generated hash codes achieve higher query precision. We perform extensive experiments on three public benchmark datasets: CIFAR-10, NUS-WIDE, and MS-COCO. Experimental results demonstrate that the proposed method outperforms the compared methods and is superior to all previous deep hashing methods based on triplet labels.
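To make the joint objective concrete, the following is a minimal PyTorch sketch, not the paper's exact formulation: it assumes a negative log-likelihood triplet term in the style of triplet-label deep hashing, and the class name `JointTripletHashLoss`, the `margin`, and the trade-off weight `lam` are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointTripletHashLoss(nn.Module):
    """Sketch of a joint supervised loss: a triplet likelihood term
    plus a linear classification term on the relaxed hash outputs.
    Hyperparameters and names here are assumptions for illustration."""

    def __init__(self, hash_bits: int, num_classes: int,
                 margin: float = 0.5, lam: float = 1.0):
        super().__init__()
        self.margin = margin
        self.lam = lam
        # Linear classifier applied to the hash-layer output.
        self.classifier = nn.Linear(hash_bits, num_classes)

    def forward(self, h_q, h_p, h_n, labels_q):
        # Inner-product similarities between the query's relaxed code
        # and those of the positive / negative items in the triplet.
        theta_qp = 0.5 * (h_q * h_p).sum(dim=1)
        theta_qn = 0.5 * (h_q * h_n).sum(dim=1)
        # Negative log triplet likelihood: softplus(-x) = -log(sigmoid(x)),
        # so this pushes the positive pair to be more similar than the
        # negative pair by at least the margin.
        triplet_nll = F.softplus(-(theta_qp - theta_qn - self.margin)).mean()
        # Linear classification loss on the query's hash output.
        cls_loss = F.cross_entropy(self.classifier(h_q), labels_q)
        return triplet_nll + self.lam * cls_loss
```

In a two-stage setup such as the one described above, a loss of this form would be applied to the outputs of three shared-weight CNN branches (query, positive, negative), with the final binary codes obtained by thresholding the relaxed outputs, e.g. `torch.sign(h_q)`.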