Abstract

Graph representation learning has been widely applied to graph tasks such as node classification, link prediction, and graph-level classification. It exploits spatial or spectral approaches to embed nodes into a low-dimensional space accurately and efficiently. In particular, Graph Neural Networks (GNNs), a family of representation learning methods, have attracted increasing interest due to their powerful ability to integrate local information and their strong generalization on graph tasks. In principle, GNN models can capture information from more distant nodes by stacking more convolution layers, but once the depth exceeds a certain level, existing methods perform increasingly worse as the network grows deeper. Most GNNs therefore employ shallow architectures and fail to capture long-range information. In this paper, we present Multi-hop Hierarchical Graph Neural Networks (MHGNNs), a new graph neural network framework that addresses this lack of long-range node information and provides a broad receptive field. Unlike prior work, a single MHGNN layer concatenates hop-level features hierarchically, aggregating features within the same hop. MHGNNs also have a flexible structure: the number of hops used can differ across layers. In addition, MHGNNs apply an attention mechanism during the integration step, which mines latent relationships among hops and adaptively selects important hop-level features. Finally, we evaluated MHGNNs on citation and protein-protein interaction graph benchmarks for node classification, where they match or advance the results of state-of-the-art methods.
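To make the idea concrete, the following is a minimal sketch of a single multi-hop layer with softmax attention over hops. The abstract does not give the exact formulation, so everything here is an assumption for illustration: the symmetric adjacency normalization, the per-hop projection matrices `weights`, and the learnable score vector `att` are all hypothetical design choices, not the paper's method.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}.
    (A common GNN convention; assumed here, not taken from the paper.)"""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_hop_layer(A, X, weights, att):
    """One hypothetical multi-hop layer: aggregate features for hops 1..K,
    project each hop separately, then combine the hop-level outputs with
    softmax attention over hops.

    weights: list of K projection matrices, one per hop (assumed design)
    att:     length-K score vector for hop-level attention (assumed design)
    """
    A_norm = normalize_adj(A)
    hop_feats = []
    H = X
    for W in weights:
        H = A_norm @ H                 # hop k sees A_norm^k X, propagated iteratively
        hop_feats.append(H @ W)        # per-hop projection keeps hop-level features separate
    alpha = np.exp(att) / np.exp(att).sum()  # softmax attention weights over hops
    return sum(a * F for a, F in zip(alpha, hop_feats))

# Toy example: 4-node path graph, 3-dim features, K = 2 hops
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))
weights = [rng.standard_normal((3, 2)) for _ in range(2)]
att = np.zeros(2)                      # uniform attention before any training
out = multi_hop_layer(A, X, weights, att)
print(out.shape)                       # (4, 2): one 2-dim embedding per node
```

Because each hop keeps its own projection before the attention-weighted sum, the layer can learn to emphasize near or distant neighborhoods per task, which is the flexibility the abstract attributes to hop-level attention.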
