Abstract

Recently, self-paced contrastive learning has emerged as a promising approach for unsupervised object re-identification. These methods generate pseudo labels, store centroid features in a memory bank, and update them periodically. However, because their quality depends on the clustering method, each cluster inevitably contains noisy instances, and self-paced contrastive learning typically requires a large number of negative samples drawn from many classes, where false negatives give rise to the class collision issue. Both problems lead to incorrect model optimization. In this paper, we propose a non-contrastive nearest neighbor identity-guided (NNNI) method to overcome these challenges. The advantage of NNNI is that it provides the model with a highly accurate prior. Specifically, the method relies on the random identity sampler commonly used in re-identification to supply the network with a regression target: the nearest neighbor of the same identity within each mini-batch. Through a Siamese network with an exponential moving average, it iteratively encodes increasingly rich information to learn high-quality representations. NNNI alleviates the negative effect of noisy instances and mitigates class collision during training. Extensive experiments show that our method is effective for unsupervised object re-identification and achieves state-of-the-art performance on three large-scale person re-identification datasets and one large-scale vehicle re-identification dataset, remaining competitive even with supervised methods.
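To make the mechanism described above concrete, the following is a minimal sketch of how a nearest-neighbor, same-identity regression target and the EMA-updated Siamese branch could be implemented. It assumes a BYOL-style negative-cosine objective and a mini-batch drawn by a P-identities-by-K-instances sampler (K >= 2); the function names, the momentum value, and the exact loss form are illustrative assumptions, not the paper's verbatim implementation.

```python
import torch
import torch.nn.functional as F

def nearest_same_id_targets(target_feats, pseudo_ids):
    """For each anchor, pick the nearest in-batch neighbor carrying the same
    pseudo identity (measured in the momentum-encoder feature space) as its
    regression target. Assumes each identity has at least two instances."""
    t = F.normalize(target_feats, dim=1)
    sim = t @ t.t()                                           # [B, B] cosine similarities
    same_id = pseudo_ids.unsqueeze(0) == pseudo_ids.unsqueeze(1)
    mask = same_id & ~torch.eye(len(pseudo_ids), dtype=torch.bool,
                                device=sim.device)            # exclude the anchor itself
    sim = sim.masked_fill(~mask, float('-inf'))
    nn_idx = sim.argmax(dim=1)                                # nearest same-identity sample
    return target_feats[nn_idx].detach()                      # stop-gradient target

def nnni_style_loss(online_feats, target_feats, pseudo_ids):
    """Non-contrastive regression loss: pull each online-branch feature toward
    the momentum-branch feature of its nearest same-identity neighbor."""
    targets = nearest_same_id_targets(target_feats, pseudo_ids)
    p = F.normalize(online_feats, dim=1)
    z = F.normalize(targets, dim=1)
    return 2 - 2 * (p * z).sum(dim=1).mean()                  # MSE between unit vectors

@torch.no_grad()
def ema_update(target_net, online_net, momentum=0.999):
    """Exponential moving average update of the Siamese target (momentum) branch."""
    for pt, po in zip(target_net.parameters(), online_net.parameters()):
        pt.data.mul_(momentum).add_(po.data, alpha=1 - momentum)
```

Because the target is a single same-identity neighbor rather than a bank of negatives, no negative pairs are required, which is how a design of this kind sidesteps the class collision issue described above.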
