Abstract

Deep graph clustering approaches employ deep graph neural networks to encode node embeddings and subsequently partition nodes based on these representations. Recent one-step methods have made progress; however, they often encounter issues during the representation learning phase. A primary challenge is that the encoding networks fail to capture the high-order structural information of the graph needed for effective clustering. We argue that jointly modeling neighborhood proximity and community aggregation characterizes the real node distribution more comprehensively and therefore yields better clustering. Additionally, the prevalent representation learning paradigms based on adjacency matrix reconstruction are computationally intensive and complicate optimization. This paper presents a novel deep graph clustering model, termed Deep Integration of Community structure and Neighborhood information (DICN). The model incorporates a graph self-attention mechanism to aggregate neighborhood-level node information, and utilizes a probabilistic generative model for edge community detection to capture community-level node information. These two-level embeddings are used to compute explicit node membership distributions for the clustering loss, ensuring that the clustering objective effectively integrates both low-order and high-order node characteristics. The end-to-end learning process of DICN optimizes not only the node embeddings but also the clustering results, by combining the two-level representation learning losses with the clustering loss and a weak guidance loss for clustering. Experiments on benchmark datasets demonstrate that DICN outperforms existing state-of-the-art methods.
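The abstract does not give the exact formulation of the membership computation. Below is a minimal sketch of the idea of deriving explicit soft cluster memberships from combined neighborhood-level and community-level embeddings, assuming a simple convex combination of the two embedding levels and a DEC-style Student's-t soft assignment; the names `z_neigh`, `z_comm`, `centroids`, and `alpha` are hypothetical illustrations, not the paper's notation:

```python
import numpy as np

def soft_membership(z_neigh, z_comm, centroids, alpha=0.5):
    """Compute soft cluster memberships from two-level node embeddings.

    z_neigh:   (n, d) neighborhood-level embeddings (hypothetical name)
    z_comm:    (n, d) community-level embeddings (hypothetical name)
    centroids: (k, d) cluster centers
    alpha:     mixing weight between the two embedding levels (assumed)
    """
    # Combine the two embedding levels; the actual fusion in DICN may differ.
    z = alpha * z_neigh + (1.0 - alpha) * z_comm
    # Student's t-kernel soft assignment, as popularized by DEC-style clustering.
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    q = 1.0 / (1.0 + d2)
    # Normalize so each node's memberships over the k clusters sum to 1.
    return q / q.sum(axis=1, keepdims=True)
```

The resulting distribution `q` can then feed a clustering loss such as the KL divergence to a sharpened target distribution, which is one common way such explicit membership distributions are trained.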


