Abstract

The study of graph neural networks (GNNs) has shown that they enable new applications across a variety of disciplines using a process so simple that it has no counterpart in other deep learning architectures. Several limitations restrict their expressiveness, however, and researchers are working to overcome them in order to fully exploit the power of graph data. A number of publications explore GNN restrictions and bottlenecks, and the common thread running through them is that these issues can all be traced back to message passing, the key mechanism used to train graph models. In this study we outline the general GNN design pipeline, discuss solutions to the over-smoothing problem, categorize those solutions, and identify open challenges for further research.

Abbreviations: CGNN: Continuous Graph Neural Networks; CNN: Convolutional Neural Network; DeGNN: Decomposition Graph Neural Network; DGN: Directional Graph Networks; DGN: Differentiable Group Normalization; DL: Deep Learning; EGAI: Enhancing GNNs by a High-quality Aggregation of Beneficial Information; GAT: Graph Attention Network; GCN: Graph Convolutional Network; GDC: Graph DropConnect; GDR: Group Distance Ratio; GNN: Graph Neural Network; GRAND: Graph Random Neural Networks; IIG: Instance Information Gain; MAD: Mean Average Distance; PDE-GCN: Partial Differential Equations-GCN; PTDNet: Parameterized Topological Denoising Network; TDGNN: Tree Decomposition Graph Neural Network.
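To make the two central notions of the abstract concrete, the sketch below shows a generic mean-aggregation message-passing layer and a simple version of the Mean Average Distance (MAD) metric used to quantify over-smoothing. This is only an illustrative NumPy sketch under assumed formulations; the function names (`message_passing_layer`, `mean_average_distance`) and the exact normalization are my own choices, not the specific models or metric definitions surveyed in the paper.

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One generic message-passing step: each node averages its neighbours'
    features (with a self-loop), then applies a linear map and ReLU."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # row-normalize -> mean aggregation
    return np.maximum(D_inv @ A_hat @ H @ W, 0.0)

def mean_average_distance(H):
    """MAD-style score: mean cosine distance between node representations.
    As representations converge (over-smoothing), this value tends to 0."""
    Hn = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)
    dist = 1.0 - Hn @ Hn.T
    n = H.shape[0]
    return dist.sum() / (n * (n - 1))

# Toy 4-node path graph: stacking many layers drives the distance toward 0.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.rand(4, 8)
W = np.eye(8)  # identity weights isolate the smoothing effect of aggregation
for layer in range(8):
    H = message_passing_layer(A, H, W)
    print(f"layer {layer + 1}: MAD = {mean_average_distance(H):.4f}")
```

Running this on the toy graph shows the node-wise distances shrinking layer after layer, which is the over-smoothing behaviour the surveyed methods aim to mitigate.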
