Abstract

Graph meta-learning has recently attracted increasing attention owing to its potential to extract common, transferable knowledge from learning different tasks on a graph. Existing methods for graph meta-learning usually leverage local subgraphs to transfer subgraph-specific information. However, they inherently face the challenge of imbalanced subgraphs, caused by inconsistent node densities and differing label distributions across local subgraphs. This paper proposes an adaptive graph meta-learning framework (AG-Meta) for learning consistent and transferable graph representations in a way that adapts to imbalanced subgraphs. Specifically, AG-Meta first learns structural representations of subgraphs with varying node degrees using an Adaptive Graph Cascade Diffusion Network (AGCDN). AG-Meta then employs a prototype-consistency classifier to produce more accurate, transferable inductive representations (also called prototypes) under few-shot settings with differing label distributions across subgraphs. Finally, a novel metric loss is introduced when optimizing the model-agnostic meta-learner to enforce consistency of both structural representations and prototypes. Extensive experiments comparing AG-Meta against baselines on five real-world networks validate that AG-Meta outperforms state-of-the-art approaches.
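To make the prototype-based few-shot setup concrete, the following is a minimal, hypothetical sketch (not the authors' code) of the standard prototype step the abstract alludes to: class prototypes are the mean support embeddings, and query nodes are scored by their distance to each prototype. The `Encoder` class stands in for AGCDN, which is not specified here; all names, dimensions, and the toy episode are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Placeholder for the subgraph encoder (AGCDN in the paper is not public here)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        return self.net(x)

def class_prototypes(emb, labels, num_classes):
    """Mean support embedding per class -> one prototype per class."""
    return torch.stack([emb[labels == c].mean(dim=0) for c in range(num_classes)])

def prototype_loss(encoder, support_x, support_y, query_x, query_y, num_classes):
    """Few-shot episode loss: classify query nodes by distance to class prototypes."""
    protos = class_prototypes(encoder(support_x), support_y, num_classes)
    logits = -torch.cdist(encoder(query_x), protos)  # closer prototype -> higher score
    return F.cross_entropy(logits, query_y)

# Toy 3-way / 5-shot episode on random features (illustrative only).
enc = Encoder(in_dim=16, hid_dim=32)
sup_x, sup_y = torch.randn(15, 16), torch.arange(3).repeat_interleave(5)
qry_x, qry_y = torch.randn(9, 16), torch.arange(3).repeat_interleave(3)
loss = prototype_loss(enc, sup_x, sup_y, qry_x, qry_y, num_classes=3)
loss.backward()  # in meta-training this gradient would drive the outer (meta) update
```

In the paper's setting, this episode loss would be combined with the proposed metric loss and optimized within a model-agnostic meta-learning loop; the sketch above only illustrates the generic prototype classifier, not AG-Meta itself.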
