Abstract

Graph neural networks (GNNs) have seen increasing success recently, with many GNN variants achieving state-of-the-art results on node and graph classification tasks. The proposed GNNs, however, often implement complex node and graph embedding schemes, which makes it challenging to explain their performance. In this paper, we investigate the link between a GNN’s expressiveness, that is, its ability to map different graphs to different representations, and its generalization performance in a graph classification setting. In particular, we propose a principled experimental procedure in which we (i) define a practical measure of expressiveness, (ii) introduce an expressiveness-based loss function that we use to train a simple yet practical permutation-invariant GNN, and (iii) illustrate our procedure on benchmark graph classification problems and on an original real-world application. Our results reveal that expressiveness alone does not guarantee better performance, and that a powerful GNN should produce graph representations that are well separated with respect to the class of the corresponding graphs.
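
To make the abstract's notions concrete, the sketch below shows one plausible way to score expressiveness (here, the mean pairwise distance between graph embeddings in a batch) and to fold it into a training loss. The choice of measure, the function names, and the trade-off weight lam are illustrative assumptions, not the paper's exact definitions.

import torch
import torch.nn.functional as F

def expressiveness(embeddings):
    # embeddings: (batch, d) graph-level representations.
    # Assumed measure: mean pairwise Euclidean distance; it grows when
    # different graphs are mapped to more distinct representations.
    n = embeddings.size(0)
    if n < 2:
        return embeddings.new_zeros(())
    dists = torch.cdist(embeddings, embeddings, p=2)
    return dists.sum() / (n * (n - 1))

def expressiveness_based_loss(logits, labels, embeddings, lam=0.1):
    # Standard classification loss minus a reward for well-separated
    # graph representations (lam is a hypothetical trade-off weight).
    return F.cross_entropy(logits, labels) - lam * expressiveness(embeddings)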

Highlights

  • To illustrate our experimental framework, we introduce a simple yet practical architecture, the Simple Permutation-Invariant Graph Convolutional Network (SPI-GCN); a minimal illustrative sketch of such an architecture is given after this list

  • We show how the defined expressiveness measure can be used to train SPI-GCN and to help understand the impact expressiveness has on its generalization performance

  • We evaluate SPI-GCN on an original real-world data set collected at the ICMPE, HYDRIDES, which contains metal hydrides in graph format, labelled as stable or unstable according to specific energetic properties that determine their ability to store hydrogen efficiently
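
As referenced in the first highlight, the following minimal sketch illustrates what a simple permutation-invariant graph convolutional network can look like: one graph convolution over a degree-normalized adjacency matrix followed by a sum readout, which is invariant to any reordering of the nodes. The class name, layer sizes, and single-layer depth are assumptions made for illustration, not the authors' exact SPI-GCN.

import torch
import torch.nn as nn

class SimplePermInvariantGCN(nn.Module):
    # Illustrative sketch (not the paper's exact SPI-GCN architecture).
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv = nn.Linear(in_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, adj, x):
        # adj: (n, n) adjacency matrix with self-loops; x: (n, in_dim) node features.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = torch.relu(self.conv((adj @ x) / deg))   # mean aggregation over neighbours
        g = h.sum(dim=0)                             # sum readout: permutation-invariant
        return self.classifier(g)                    # class scores for the whole graph

Graph-level logits produced by such a model could then be fed, together with the graph embeddings g, to an expressiveness-based loss like the one sketched after the abstract.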

Introduction

Many real-world data present an inherent structure and can be modelled as sequences, graphs, or hypergraphs [2,5,9,15]. Graph-structured data, in particular, are very common in practice and are at the heart of this work. We consider the problem of graph classification: given a set G = {G_i}_{i=1}^m of arbitrary graphs and their respective labels {y_i}_{i=1}^m, where y_i ∈ {1, …, C} and C is the number of classes, we aim at finding a mapping from graphs to class labels that generalizes to unseen graphs.
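
In symbols, the problem stated above can be written as empirical risk minimization over graphs; this formalization is our own illustration of the standard supervised setting, not an equation quoted from the paper:

\begin{aligned}
&\text{Given } \mathcal{G} = \{G_i\}_{i=1}^{m} \text{ with labels } y_i \in \{1, \dots, C\},\\
&\text{find } f : \mathcal{G} \to \{1, \dots, C\} \text{ minimizing } \frac{1}{m} \sum_{i=1}^{m} \ell\bigl(f(G_i), y_i\bigr),
\end{aligned}

where \ell is a classification loss such as the cross-entropy, and f is realized by a GNN that first embeds each graph and then classifies the resulting representation.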

