Abstract

Graph neural networks (GNNs) have received increasing attention in recent years, owing to the ubiquity of graph and network data and their superior performance compared to traditional heuristics-driven approaches. However, most existing GNNs focus on node-level applications, such as node classification and link prediction, while many challenging graph tasks are graph-level, such as graph search. In this talk, I will introduce our recent progress on graph-level neural operator development. In particular, we will examine three challenging tasks that are key to the success of graph search: (1) How can we conduct efficient graph similarity search by casting NP-hard problems, such as Graph Edit Distance (GED) and Maximum Common Subgraph (MCS) computation, as learning problems? We will present SimGNN [1] and GraphSim [3], which provide more efficient and effective results than state-of-the-art approximate algorithms. (2) How can we provide a neural operator that turns any graph into a low-dimensional representation vector that is learnable, inductive, and unsupervised? In this line, we propose UGraphEmb [2], which leverages graph-graph interaction to produce manifold-preserving graph-level embeddings. Moreover, GHashing [5] is designed to map each graph to a discrete hash code, enabling much more efficient search (a 20x speedup) over large graph databases with millions of graphs. And (3) how can we design GNNs that directly detect the best-matching subgraphs of two graphs? A deep reinforcement learning framework, RLMCS [4], is proposed to address this issue, with the goal of learning the best strategy for picking the next matching pair between two graphs. In the end, we will discuss open questions in the field.
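The graph-level embedding and hashing ideas in (2) can be illustrated with a minimal sketch. Note that this is not the actual UGraphEmb or GHashing implementation: the single-round mean-neighbor aggregation, mean pooling, and sign-based binarization below are simplified illustrative assumptions.

```python
import numpy as np

def graph_embedding(A, X):
    """One round of mean-neighbor aggregation over the adjacency matrix A
    and node features X, then mean-pool nodes into one graph-level vector.
    (Illustrative sketch, not the UGraphEmb architecture.)"""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # node degrees (incl. self)
    H = A_hat @ X / deg                      # average of self + neighbors
    return H.mean(axis=0)                    # graph-level embedding

def graph_hash(embedding):
    """Binarize each dimension to get a discrete hash code,
    enabling bucketed lookup instead of exhaustive comparison."""
    return tuple(int(v > 0) for v in embedding)

# Two small graphs: a triangle and a 3-node path, with random node features
rng = np.random.default_rng(0)
A1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle
A2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path
X1 = rng.standard_normal((3, 4))
X2 = rng.standard_normal((3, 4))

g1, g2 = graph_embedding(A1, X1), graph_embedding(A2, X2)
sim = g1 @ g2 / (np.linalg.norm(g1) * np.linalg.norm(g2))  # cosine similarity
print(graph_hash(g1), graph_hash(g2), float(sim))
```

In a trained system the aggregation weights would be learned so that similar graphs land in nearby embeddings (and identical hash buckets), which is what makes million-graph retrieval tractable.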
