Abstract

Graph Neural Networks (GNNs) have emerged as a useful paradigm for processing graph-structured data. Typically, GNNs are stacked into multiple layers, and the node representations in each layer are computed by propagating and aggregating neighboring node features. Normalization techniques are necessary to effectively train a GNN with multiple layers. Although existing normalization techniques have achieved good results in aiding GNN training, they seldom take the structure information of the graph into account. In this paper, we propose two graph-aware normalization techniques, namely adjacency-wise normalization and graph-wise normalization, which fully exploit the structure information of the graph. Furthermore, we propose a novel approach, termed Attentive Graph Normalization (AGN), which learns a weighted combination of multiple graph-aware normalization methods, aiming to automatically select the optimal combination for a specific task. We conduct extensive experiments on eleven benchmark datasets, including three single-graph and eight multiple-graph datasets, and the experimental results provide a comprehensive evaluation of the effectiveness of our proposals.
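The abstract describes the two graph-aware normalizations and their attentive combination only at a high level. The sketch below is not taken from the paper; it is a minimal illustration, assuming a dense adjacency matrix and hypothetical class and method names, of how adjacency-wise statistics (over a node's neighborhood), graph-wise statistics (over all nodes of a graph), and a learned softmax weighting might be combined in PyTorch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveGraphNorm(nn.Module):
    """Illustrative sketch (not the authors' code): softmax-weighted
    combination of two graph-aware normalizations."""

    def __init__(self, eps: float = 1e-5):
        super().__init__()
        # one learnable score per candidate normalization
        self.scores = nn.Parameter(torch.zeros(2))
        self.eps = eps

    def adjacency_wise(self, h, adj):
        # Normalize each node with the mean/variance of its own and its
        # neighbors' features (dense adjacency assumed for brevity).
        a = adj + torch.eye(adj.size(0), device=adj.device)  # add self-loops
        deg = a.sum(dim=1, keepdim=True).clamp(min=1.0)
        mean = (a @ h) / deg
        sq_mean = (a @ h.pow(2)) / deg
        var = (sq_mean - mean.pow(2)).clamp(min=0.0)
        return (h - mean) / torch.sqrt(var + self.eps)

    def graph_wise(self, h):
        # Normalize with statistics computed over all nodes of the graph.
        mean = h.mean(dim=0, keepdim=True)
        var = h.var(dim=0, unbiased=False, keepdim=True)
        return (h - mean) / torch.sqrt(var + self.eps)

    def forward(self, h, adj):
        w = F.softmax(self.scores, dim=0)  # attention over the normalizers
        return w[0] * self.adjacency_wise(h, adj) + w[1] * self.graph_wise(h)

# Toy usage: 5 nodes with 8-dimensional features on a random undirected graph.
h = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)
out = AttentiveGraphNorm()(h, adj)
print(out.shape)  # torch.Size([5, 8])
```

The softmax over the learnable scores lets training shift weight toward whichever normalization suits the task, which is the intuition behind the attentive combination described in the abstract.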
