Abstract

Graph convolutional neural networks (GCNNs) have emerged in recent years to handle graph-structured data. Most existing GCNNs are either spatial approaches, which operate on the neighborhood of each node, or spectral approaches based on the graph Laplacian. Compared with spatial-based GCNNs, spectral-based GCNNs can exploit graph structure information more fully, but they usually treat graphs as undirected. In practice, many graphs are directed, such as social networks and citation networks, and treating them as undirected may discard information that is helpful for graph learning tasks. This motivates us to construct a spectral-based GCNN for directed graphs. In this paper, we propose a scalable graph convolutional neural network with fast localized convolution operators derived from the directed graph Laplacian, called the fast directed graph convolutional network (FDGCN). FDGCN works directly on directed graphs and scales to large graphs because the cost of its convolution operation is linear in the number of edges. Furthermore, we show that FDGCN unifies the graph convolutional network (GCN), a classic spectral-based GCNN, and we analyze its mechanism from a spatial aggregation point of view. Since previous work has confirmed that accounting for the uncertainty of the graph can substantially improve GCN, the proposed FDGCN is further enhanced through extra training epochs on random graphs generated by a mixed membership stochastic block model (MMSBM). Experiments on semi-supervised node classification tasks show that our model outperforms or matches state-of-the-art models in most cases.
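
For background, the directed graph Laplacian from which the convolution operator above is derived is commonly taken to be Chung's Laplacian for directed graphs; the sketch below states that standard definition, under the assumption that FDGCN builds on it (the exact variant used in the paper may differ). Here $P$ is the transition matrix of a random walk on the directed graph and $\phi$ is its Perron (stationary) vector:

\[
\mathcal{L} = I - \tfrac{1}{2}\Big(\Phi^{1/2} P\,\Phi^{-1/2} + \Phi^{-1/2} P^{\top} \Phi^{1/2}\Big),
\qquad \Phi = \operatorname{diag}(\phi), \quad \phi^{\top} P = \phi^{\top}
\]

Because $\mathcal{L}$ is symmetric and positive semi-definite, it admits the orthonormal eigendecomposition on which spectral filtering, and hence the Chebyshev approximation mentioned in the highlights below, relies.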

Highlights

  • Deep learning models like convolutional neural networks (CNNs) have shown great success in tackling learning problems such as image classification [1], object detection [2], image inpainting [3], etc.

  • In this paper, we propose a scalable graph convolutional network with a fast localized spectral filter, named the fast directed graph convolutional network (FDGCN), which is targeted at directed graphs

  • The fast localized convolution operator in FDGCN is constructed by combining a first-order Chebyshev polynomial approximation of the spectral filter with a further approximation that fixes the Perron vector (a sketch of the resulting layer-wise rule follows this list)
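
By analogy with GCN's first-order simplification, a plausible form of the resulting layer-wise rule is sketched below; this is an illustration of the construction described in the highlights, not necessarily the exact operator reported in the paper. Here $P$ and $\Phi = \operatorname{diag}(\phi)$ are as in the abstract's background sketch, $H^{(l)}$ are the layer-$l$ node features, $W^{(l)}$ is a trainable weight matrix, and $\sigma$ is a nonlinearity:

\[
H^{(l+1)} = \sigma\!\left(\tilde{A}\, H^{(l)} W^{(l)}\right),
\qquad
\tilde{A} = I + \tfrac{1}{2}\Big(\Phi^{1/2} P\,\Phi^{-1/2} + \Phi^{-1/2} P^{\top} \Phi^{1/2}\Big)
\]

Fixing the Perron vector $\phi$ during training keeps $\tilde{A}$ constant and sparse, which is presumably what makes the operator fast and 1-hop localized.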

Summary

INTRODUCTION

Deep learning models like convolutional neural networks (CNNs) have shown great success in tackling learning problems such as image classification [1], object detection [2], and image inpainting [3]. Among spatial-based GCNNs, adaptive receptive paths are considered in [16] and a clustering technique is used in [17]. Another branch is spectral-based GCNN models, which use spectral filters based on the graph Laplacian to construct the network. Works [30] and [31] try to generalize existing graph convolutional networks into a unified framework: the former uses a message-passing mechanism and the latter uses pseudo-coordinates to give the explanation. Although spatial-based models generalize to new graphs because they perform convolution locally on each node, they cannot fully integrate graph structure information. Initial attempts have been made to construct a spectral filter from the directed graph Laplacian, with the corresponding model named DGCN [33]. In our model, the directed graph convolution operator is first derived via a first-order Chebyshev polynomial approximation of the spectral filter for the directed graph Laplacian, which makes FDGCN 1-hop localized.
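
To make the "linear in the number of edges" claim concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of one symmetrized propagation step on a directed graph, kept sparse so that the dominant cost is a handful of sparse products proportional to the number of edges. The function name directed_propagation, the teleportation parameter alpha, and the power-iteration estimate of the Perron vector are all illustrative choices.

import numpy as np
import scipy.sparse as sp

def directed_propagation(A, X, alpha=0.05, n_iter=100):
    # A: (n x n) scipy.sparse directed adjacency matrix, A[i, j] != 0 for an edge i -> j.
    # X: (n x d) dense node-feature matrix.
    n = A.shape[0]

    # Row-normalized random-walk transition matrix P = D_out^{-1} A.
    out_deg = np.asarray(A.sum(axis=1)).ravel()
    out_deg[out_deg == 0] = 1.0                      # avoid division by zero at sink nodes
    P = sp.diags(1.0 / out_deg) @ A.tocsr()

    # Approximate the Perron (stationary) vector by PageRank-style power iteration;
    # the teleportation term alpha keeps it well defined on non-strongly-connected graphs.
    phi = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        phi = (1.0 - alpha) * (P.T @ phi) + alpha / n
    phi = np.clip(phi, 1e-12, None)

    # Symmetrized propagation matrix built from the directed graph Laplacian:
    # (Phi^{1/2} P Phi^{-1/2} + Phi^{-1/2} P^T Phi^{1/2}) / 2, kept sparse throughout.
    phi_sqrt = np.sqrt(phi)
    S = 0.5 * (sp.diags(phi_sqrt) @ P @ sp.diags(1.0 / phi_sqrt)
               + sp.diags(1.0 / phi_sqrt) @ P.T @ sp.diags(phi_sqrt))

    # One propagation step; each sparse product costs O(|E| * d).
    return S @ X

A two-layer network in the spirit of GCN could then be obtained by applying this step twice, with a trainable weight matrix and a nonlinearity after each application.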

PRELIMINARIES
SPECTRAL-BASED GCNNS
THE LAPLACIAN FOR A DIRECTED GRAPH
INSPIRATION
FDGCN MODEL FOR SEMI-SUPERVISED NODE CLASSIFICATION
FDGCN MODEL WITH MMSBM
SPATIAL AGGREGATION POINT OF VIEW
EXPERIMENTS
ADDITIONAL EXPERIMENT
Findings
CONCLUSIONS
