Abstract

As an indispensable branch of unsupervised learning, deep clustering is rapidly emerging alongside the growth of deep neural networks. Recently, the contrastive learning paradigm has been combined with deep clustering to achieve more competitive performance. However, previous works mostly employ random augmentations to construct sample pairs for contrastive clustering: different augmentations of a sample are treated as positive pairs, which may introduce false positives and ignore the semantic variations across different samples. To address these limitations, we present a novel end-to-end contrastive clustering framework termed Contrastive Clustering with Effective Sample pairs construction (CCES), which obtains richer semantic information by jointly leveraging an effective data augmentation method, ContrastiveCrop, and constructing positive sample pairs via nearest-neighbor mining. Specifically, we augment original samples with ContrastiveCrop, which explicitly reduces false positives and enlarges the variance among samples. Further, using the extracted feature representations, we provide a strategy that pairs each sample with its nearest neighbor to form positive pairs for instance-wise and cluster-wise contrastive learning. Experimental results on four challenging datasets demonstrate the effectiveness of CCES for clustering, surpassing state-of-the-art deep clustering methods.
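The nearest-neighbor positive-pair construction described above can be sketched as follows. This is a minimal illustration under assumptions, not the paper's implementation: the function name `mine_nn_positives` is hypothetical, and cosine similarity is one plausible choice of metric for finding each sample's nearest neighbor in feature space.

```python
import numpy as np

def mine_nn_positives(features: np.ndarray) -> np.ndarray:
    """For each row (sample embedding), return the index of its nearest
    neighbor by cosine similarity, excluding the sample itself.

    The returned indices define positive pairs (i, nn[i]) that a
    contrastive loss can then pull together. Illustrative sketch only.
    """
    # L2-normalize so that dot products equal cosine similarities.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    # A sample must not be its own positive: mask the diagonal.
    np.fill_diagonal(sim, -np.inf)
    return sim.argmax(axis=1)

# Toy example: two loose "clusters" in 2-D feature space.
feats = np.array([[1.0, 0.0],
                  [0.9, 0.1],
                  [0.0, 1.0],
                  [0.1, 0.9]])
nn = mine_nn_positives(feats)
print(nn)  # → [1 0 3 2]: each sample pairs with its semantic neighbor
```

In the full method these pairs would feed both the instance-wise and cluster-wise contrastive objectives; here only the mining step is shown.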
