Abstract

Graph convolutional networks (GCNs) are neural network frameworks for machine learning on graphs. They can simultaneously perform end-to-end learning on the attribute information and the structure information of graph data. However, most existing GCNs suffer from poor robustness and low classification accuracy when labeled nodes are scarce. To address these two issues, this paper proposes the deep graph convolutional generative adversarial network (DGCGAN), a model combining a GCN with a deep convolutional generative adversarial network (DCGAN). First, the graph data is mapped to a highly nonlinear space by applying a symmetric normalized Laplacian transform to the topology and attribute information of the graph. Then, through the feature-structured enhanced module, the node features are expanded into regular structured data, such as images and sequences, which are fed into DGCGAN as positive samples, thus expanding the sample capacity. In addition, the feature-enhanced (FE) module is adopted to enhance the typicality and discriminability of node features and to obtain richer, more representative features, which facilitates accurate classification. Finally, additional constraints are imposed on the network model by introducing DCGAN, thus enhancing the robustness of the model. Through extensive empirical studies on several standard benchmarks, we find that DGCGAN outperforms state-of-the-art baselines on semi-supervised node classification and remote sensing image classification.
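The symmetric normalized Laplacian transform mentioned in the abstract is the standard GCN propagation rule, which combines a node's features with those of its neighbors via the degree-normalized adjacency matrix. The following is a minimal NumPy sketch of that step, not the authors' implementation; the toy adjacency matrix and one-hot features are hypothetical, and the learned weight matrix is omitted:

```python
import numpy as np

# Hypothetical toy graph: 4 nodes with adjacency matrix A and feature matrix X.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.eye(4)  # one-hot node features, for illustration only

# Add self-loops, then symmetrically normalize: D^{-1/2} (A + I) D^{-1/2}
A_hat = A + np.eye(A.shape[0])
degrees = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(degrees))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

# One propagation step: each node aggregates normalized neighbor features.
# (A full GCN layer would also multiply by a learned weight matrix W
# and apply a nonlinearity.)
H = A_norm @ X
print(H.shape)  # one row of aggregated features per node
```

Because the normalization is applied symmetrically on both sides, `A_norm` remains symmetric, which keeps its spectrum well behaved when layers are stacked.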
