Abstract

To avoid overfitting on test examples, the feature encoding scheme of graph neural networks (GNNs) usually includes a dropout procedure. However, once latent node representations are learned under this scheme, the Gaussian noise produced by the dropout operation is inevitably propagated into the subsequent neighborhood aggregation step, which necessarily hampers the unbiased aggregation ability of GNN models. To address this issue, this article presents a novel aggregator, denoising aggregation (DNAG), which utilizes principal component analysis (PCA) to preserve the real signal in aggregated neighboring features while simultaneously filtering out the Gaussian noise. This differs from the traditional use of PCA for dimensionality reduction: we instead regard PCA as an aggregator that compresses neighboring node features to attain stronger expressive denoising power. We also propose new training architectures that simplify the intensive computation of PCA in DNAG. Numerical experiments show the clear superiority of the proposed DNAG models in gaining denoising capability and achieving state-of-the-art results on a set of predictive tasks over several graph-structured datasets.
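To illustrate the core idea, the following is a minimal sketch (not the authors' DNAG implementation) of using PCA as a denoising step over a set of neighboring node features: the features are projected onto their top-k principal components, which retains the dominant shared signal while discarding low-variance directions that mostly carry noise, and are then aggregated. The function name `pca_denoise` and all dimensions are illustrative assumptions.

```python
import numpy as np

def pca_denoise(neighbor_feats, k):
    """Project neighbor features onto their top-k principal components.

    Reconstructing from only k components keeps the dominant shared
    structure of the neighborhood and suppresses the low-variance
    directions, which are dominated by (Gaussian) noise.
    """
    mean = neighbor_feats.mean(axis=0, keepdims=True)
    centered = neighbor_feats - mean
    # Rows of Vt are the principal directions of the centered features.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    # Project onto the top-k components and reconstruct.
    return centered @ Vt[:k].T @ Vt[:k] + mean

rng = np.random.default_rng(0)
# Synthetic neighborhood: 32 neighbors whose clean features lie in a
# rank-2 subspace of an 8-dimensional feature space, plus Gaussian noise.
clean = rng.normal(size=(32, 2)) @ rng.normal(size=(2, 8))
noisy = clean + 0.1 * rng.normal(size=clean.shape)

denoised = pca_denoise(noisy, k=2)
err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)

# A simple mean aggregation over the denoised neighbor features.
aggregated = denoised.mean(axis=0)
```

With the noise confined to directions outside the top-2 subspace, the denoised features land closer to the clean signal than the raw noisy ones, so the downstream aggregation sees less dropout-induced noise.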

