Abstract

In graph representation learning, Graph Convolutional Networks (GCNs) and their variants have received much attention. However, GCNs suffer from oversmoothing as the models get deeper, limiting their ability to aggregate node representations from high-order neighborhoods. Inspired by the modular structure of the brain network, we propose Modularity-based Siamese Simple Graph Convolution (MS-SGC), a Siamese network architecture that incorporates the characteristics of brain modular structure into graph convolutional networks. Spectral clustering is leveraged to detect the modular structure of the graph, and the weights of cross edges between modules are then lowered. A Siamese network is adopted to combine the modularity-preserved graph representation with the original graph representation, improving classification performance and reducing oversmoothing. Furthermore, a graph convolution method that functions as a linear low-pass graph filter is derived through spectral analysis to preserve the similarity of nodes within the same module and further alleviate the oversmoothing problem. We validate the effectiveness of MS-SGC on citation networks and extend our experimental analysis to various downstream tasks. Extensive experiments demonstrate that MS-SGC outperforms state-of-the-art methods while reducing time and computational complexity for node classification. Moreover, MS-SGC achieves competitive performance in node clustering, community prediction, and text classification tasks.
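The pipeline described above can be sketched in code. This is a minimal illustration, not the authors' implementation: all function names, the down-weighting factor `alpha`, and the toy graph are assumptions. It shows (1) spectral clustering to detect modules, (2) lowering the weights of cross-module edges, and (3) SGC-style K-step propagation, which acts as a linear low-pass graph filter, applied to both the modularity-preserved and original graphs as the two Siamese branches would be.

```python
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import KMeans

# Hypothetical sketch of the MS-SGC pipeline from the abstract; names and
# parameters (e.g. alpha, K) are illustrative assumptions, not the paper's.

def spectral_modules(A, k):
    """Detect k modules by clustering the k smallest Laplacian eigenvectors."""
    L = laplacian(A, normed=True)
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, :k]
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(emb)

def downweight_cross_edges(A, labels, alpha=0.5):
    """Lower the weight of edges crossing module boundaries by factor alpha."""
    cross = labels[:, None] != labels[None, :]
    return np.where(cross, alpha * A, A)

def sgc_propagate(A, X, K=2):
    """SGC-style propagation S^K X, with S the symmetrically normalized
    adjacency (self-loops added) -- a linear low-pass graph filter."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    S = A_hat / np.sqrt(np.outer(d, d))
    for _ in range(K):
        X = S @ X
    return X

# Toy graph: two triangles (modules) joined by a single cross edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

labels = spectral_modules(A, k=2)
A_mod = downweight_cross_edges(A, labels, alpha=0.5)
X = np.eye(6)                               # one-hot node features
Z_mod = sgc_propagate(A_mod, X, K=2)        # modularity-preserved branch
Z_orig = sgc_propagate(A, X, K=2)           # original-graph branch
```

In the full model, the two branch outputs `Z_mod` and `Z_orig` would be combined by the Siamese architecture before classification; here they simply demonstrate how down-weighting the single cross edge keeps the two modules' representations more separated after low-pass filtering.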
