Abstract

Graph neural networks (GNNs) can operate directly on graph-structured data. Most current GNNs are confined to the spatial domain and learn real-valued low-dimensional embeddings for graph classification tasks. In this article, we explore frequency-domain-oriented complex GNNs in which each node's embedding at every layer is a complex vector. The main difficulty lies in the design of graph pooling, for which we propose a mirror-connected design that raises two crucial problems: parameter reduction and complex gradient backpropagation. To address the former, we introduce the notion of squared singular value pooling (SSVP) and prove that the representation power of SSVP followed by a fully connected layer with nonnegative weights is exactly equivalent to that of a mirror-connected layer. To resolve the latter, we provide an alternative, feasible method for computing the singular values of complex embeddings with a theoretical guarantee. Finally, we propose a mixture of pooling strategies in which first-order statistical information is employed to enrich the final low-dimensional representation. Experiments on benchmark datasets demonstrate the effectiveness of complex GNNs with mirror-connected layers.
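The full architecture is given in the paper; the following is only a minimal sketch of the pooling idea described above, under assumptions not stated in the abstract (PyTorch, a single graph, and squared singular values obtained as eigenvalues of the Hermitian Gram matrix so that no complex SVD gradient is needed). The function and layer names are hypothetical.

```python
import torch

def ssvp(z: torch.Tensor) -> torch.Tensor:
    """Squared singular value pooling (illustrative sketch).

    z: complex node-embedding matrix of shape (num_nodes, d).
    Returns the d squared singular values of z as a real vector.

    The squared singular values of z equal the eigenvalues of the
    Hermitian Gram matrix z^H z, which are real and nonnegative, so we
    can avoid differentiating through a complex SVD.
    """
    gram = z.conj().transpose(-2, -1) @ z      # (d, d) Hermitian PSD matrix
    return torch.linalg.eigvalsh(gram)         # real eigenvalues, ascending

class NonnegLinear(torch.nn.Module):
    """Fully connected layer constrained to nonnegative weights (assumed form)."""
    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        self.w = torch.nn.Parameter(torch.randn(d_out, d_in))
        self.b = torch.nn.Parameter(torch.zeros(d_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softplus keeps the effective weights nonnegative
        return x @ torch.nn.functional.softplus(self.w).t() + self.b

# Toy usage: one graph with 5 nodes and 4-dimensional complex embeddings.
z = torch.randn(5, 4, dtype=torch.cfloat)
pooled = ssvp(z)               # real vector of shape (4,)
head = NonnegLinear(4, 2)
logits = head(pooled)          # graph-level prediction
```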
