Abstract

This research focuses on semi-supervised classification of graph-structured data in data-scarce settings. Conventional supervised graph convolutional models are known to perform poorly at classification when only a small fraction of the nodes is labeled. Moreover, most existing graph neural network models ignore the noise introduced during graph construction and treat all observed relations between objects as genuine ground truth; as a result, missing edges are never recovered while spurious edges are retained. To address these challenges, we propose a Bayesian Graph Attention model (BGAT) that treats the observed graph as a realization of a generative model. The method infers the joint posterior distribution of node labels and graph structure by combining the Mixed-Membership Stochastic Block Model with the Graph Attention model. We adopt a variety of approximation methods to estimate the Bayesian posterior distribution of the missing labels. The proposed method is evaluated comprehensively on three graph-based deep learning benchmark datasets. The experimental results show that BGAT performs competitively against current state-of-the-art models on semi-supervised node classification when few labels are available, with improvements of up to 5%.
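
To make the inference step concrete, the following is a minimal sketch of the kind of Bayesian marginalization such a model performs, assuming the standard Bayesian graph-neural-network factorization; the symbols (Z for the unknown labels, Y_L for the observed labels, X for node features, W for attention-network weights, G for sampled graphs, G_obs for the observed graph, N_G and S for the numbers of graph and weight samples) are illustrative assumptions rather than notation taken from the paper:

% Illustrative sketch (assumed formulation): the posterior over the unknown
% labels Z marginalizes over graphs G and network weights W, and is
% approximated by Monte Carlo sampling.
\[
p(Z \mid Y_L, X, G_{\mathrm{obs}})
  = \int p(Z \mid W, G, X)\, p(W \mid Y_L, X, G)\, p(G \mid G_{\mathrm{obs}})\, \mathrm{d}W\, \mathrm{d}G
  \;\approx\; \frac{1}{N_G S} \sum_{i=1}^{N_G} \sum_{s=1}^{S} p(Z \mid W_{i,s}, G_i, X).
\]

Under this sketch, the graphs G_i would be drawn from a Mixed-Membership Stochastic Block Model fitted to G_obs, and the weights W_{i,s} from an approximate posterior (e.g., Monte Carlo dropout) of a graph attention model trained on G_i; the exact approximation scheme used by BGAT is described in the paper itself.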
