Abstract

Graph neural networks (GNNs) are powerful models for node classification tasks. When the class distribution of nodes is skewed, the imbalanced graph data and the message-passing mechanism of GNNs can significantly distort the decision boundary. Existing methods, which generate additional nodes and edges or assign weights to nodes to balance the training scenario, generally involve complicated implementations and high time complexity. To this end, we propose Balanced Neighbor Exploration (BNE), an algorithm that improves training efficiency and classification performance in a fundamentally different way, without generating nodes or computing weights. Intuitively, nodes that are highly influenced by nodes of one class should have a higher probability of belonging to that class. Guided by this intuition, BNE incorporates nodes highly influenced by minority nodes into the minority set, providing a balanced training scenario. The neighbor exploration relies only on sum and sort operations, whose time complexities are O(N) and O(N log N), respectively, so it runs quickly. Moreover, BNE conducts the message-passing and neighbor exploration processes simultaneously, yielding a straightforward implementation. Experiments on real-world imbalanced graph data demonstrate that BNE vastly outperforms state-of-the-art methods for semi-supervised node classification on imbalanced graph data.
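The following is a minimal, hypothetical sketch of the idea described in the abstract, not the authors' exact algorithm: minority-class membership is propagated over the graph by message passing, the unlabeled nodes most influenced by minority seeds are identified with a sort, and those nodes are added to the minority training set. All function and variable names (e.g., `balanced_neighbor_exploration`, `hops`, `k`) are illustrative assumptions.

```python
import numpy as np

def balanced_neighbor_exploration(adj, labels, train_mask, minority_class, k, hops=2):
    """Hypothetical sketch: propagate minority-class influence over the graph,
    then move the k most-influenced unlabeled nodes into the minority training set."""
    n = adj.shape[0]

    # Row-normalize the adjacency matrix so each propagation step averages neighbor values.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    norm_adj = adj / deg

    # Seed vector: 1 for labeled minority nodes, 0 elsewhere.
    influence = np.zeros(n)
    influence[train_mask & (labels == minority_class)] = 1.0

    # Message passing: accumulate influence received from minority seeds
    # (the per-node summation is the O(N) component mentioned in the abstract).
    for _ in range(hops):
        influence = norm_adj @ influence

    # Sort unlabeled nodes by received influence (O(N log N)) and keep the top k.
    candidates = np.where(~train_mask)[0]
    top = candidates[np.argsort(-influence[candidates])[:k]]

    # Enlarge the minority training set with the selected, pseudo-labeled nodes.
    new_mask = train_mask.copy()
    new_mask[top] = True
    new_labels = labels.copy()
    new_labels[top] = minority_class
    return new_mask, new_labels
```

In this sketch the propagation reuses the same neighbor aggregation a GNN layer performs, which is one way the message-passing and neighbor-exploration steps could be carried out simultaneously, as the abstract suggests.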
