Abstract

Deep graph clustering, which efficiently divides nodes into multiple disjoint clusters in an unsupervised manner, has become a crucial tool for analyzing ubiquitous graph data. Existing methods achieve impressive clustering results by optimizing the clustering network under a parametric condition: predefining the true number of clusters (K_tr). However, K_tr is inaccessible in purely unsupervised scenarios, where existing methods are incapable of inferring the number of clusters (K), which limits their feasibility. This article proposes the first Parameter-Agnostic Deep Graph Clustering method (PADGC), which consists of two core modules, K-guidance clustering and topological-hierarchical inference, to infer K efficiently and produce strong clustering predictions. Specifically, K-guidance clustering optimizes the cluster assignments and discriminative embeddings in a mutually promoting manner under the latest updated K, even when K deviates from K_tr. In turn, the optimized cluster assignments are used in topological-hierarchical inference to explore a more accurate K, splitting dispersive clusters and merging coupled ones. In this way, the two modules are optimized complementarily until they produce the final convergent K and discriminative cluster assignments. Extensive experiments on several benchmarks, including graphs and images, demonstrate the superiority of our method. On 11 out of 12 datasets, the mean value of our inferred K deviates from K_tr by less than 1. Our method also achieves clustering performance competitive with existing parametric deep graph clustering methods.
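The abstract describes an alternating scheme: cluster under the current estimate of K, then update K by splitting dispersive clusters and merging coupled ones, repeating until K converges. The paper's actual modules operate on learned graph embeddings; as a rough illustration only, the sketch below mimics the split/merge loop with plain k-means on raw features. All thresholds (`split_thr`, `merge_thr`) and the dispersion/coupling criteria here are hypothetical stand-ins, not the method from the paper.

```python
import numpy as np

def kmeans(X, K, rng, iters=50):
    # Plain Lloyd's k-means: returns per-point labels and cluster centroids.
    centroids = X[rng.choice(len(X), K, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(K):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    return labels, centroids

def infer_k(X, k0=1, split_thr=2.0, merge_thr=1.0, max_rounds=10, seed=0):
    # Toy analogue of the split/merge inference loop: alternate clustering
    # under the current K with a heuristic update of K until it converges.
    rng = np.random.default_rng(seed)
    K = k0
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_rounds):
        labels, cents = kmeans(X, K, rng)
        new_K = K
        # Split dispersive clusters: within-cluster spread above a threshold.
        for k in range(K):
            pts = X[labels == k]
            if len(pts) > 1 and pts.std(axis=0).mean() > split_thr:
                new_K += 1
        # Merge coupled clusters: centroid pairs closer than a threshold.
        for i in range(K):
            for j in range(i + 1, K):
                if np.linalg.norm(cents[i] - cents[j]) < merge_thr:
                    new_K -= 1
        new_K = max(new_K, 1)
        if new_K == K:  # K has converged
            break
        K = new_K
    return K, labels
```

On well-separated synthetic blobs, the loop starting from K=1 splits the single over-dispersed cluster and settles on the correct count; the real method replaces these heuristics with topological-hierarchical criteria over learned embeddings.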
