Abstract

Contrastive learning is widely used in deep image clustering because of its ability to learn discriminative representations. However, some studies simply combine contrastive learning with clustering; this line of work often ignores semantically meaningful representations and leads to suboptimal performance. In this paper, we propose a new deep image clustering framework called Nearest Neighbor Contrastive Clustering (NNCC), which fuses contrastive learning with neighbor relation mining. During training, contrastive learning and neighbor relation mining are updated alternately: the former is conducted in the backward pass, while the latter is performed in the forward pass. In particular, we empirically find that data augmentation is an effective technique for generating nearest neighbors; stronger data augmentation brings more nearest neighbors into contrastive learning, which in turn yields more powerful discriminative representations. Thanks to effective neighbor relation mining, the proposed framework learns more semantically meaningful representations through contrastive learning and produces more accurate image clusters. Experimental results on six image datasets show that the proposed framework outperforms state-of-the-art clustering methods.
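To make the idea concrete, below is a minimal sketch (not the authors' code) of an InfoNCE-style contrastive loss in which, besides each sample's augmented view, its nearest in-batch neighbor is also treated as a positive. The function name `nn_contrastive_loss`, the mining-by-cosine-similarity step, and the equal weighting of the two positives are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of contrastive learning with nearest-neighbor
# positives (hypothetical; function names and weighting are assumptions).
import numpy as np

def l2_normalize(x):
    # Project embeddings onto the unit sphere (row-wise).
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def nn_contrastive_loss(z, z_aug, temperature=0.5):
    """InfoNCE-style loss: for each embedding z[i], the augmented view
    z_aug[i] AND the augmented view of z[i]'s nearest in-batch neighbor
    are both treated as positives."""
    z, z_aug = l2_normalize(z), l2_normalize(z_aug)
    n = z.shape[0]
    sim = z @ z_aug.T / temperature          # (n, n) cross-view similarities

    # Neighbor relation mining in the forward pass: nearest neighbor of
    # z[i] among the other original embeddings, by cosine similarity.
    sim_self = z @ z.T
    np.fill_diagonal(sim_self, -np.inf)      # exclude self-matches
    nn_idx = sim_self.argmax(axis=1)

    # Row-wise log-softmax; positives are the matching augmented view
    # and the nearest neighbor's augmented view.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    idx = np.arange(n)
    loss = -(log_prob[idx, idx] + log_prob[idx, nn_idx]) / 2
    return loss.mean()

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))                 # batch of 8 embeddings
z_aug = z + 0.05 * rng.normal(size=(8, 16))  # lightly perturbed "views"
loss = nn_contrastive_loss(z, z_aug)
```

In this toy setup, a stronger augmentation (larger perturbation) changes which neighbors get mined, mirroring the abstract's observation that augmentation strength controls how many useful nearest neighbors enter the contrastive objective.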
