Abstract

Transferring pre-trained graph neural networks (GNNs) to downstream tasks in low-resource scenarios remains challenging due to insufficient supervision and the gap between pre-training pretext tasks and downstream tasks. In this paper, we propose a Contrastive Fine-tuning (Con-tuning) framework for low-resource graph-level transfer learning and design a graph-level supervised contrastive learning (SCL) task within it, which is the first attempt to introduce SCL into the fine-tuning of pre-trained GNNs. The SCL task compensates for insufficient supervision in low-resource scenarios and narrows the gap between pretext and downstream tasks. To further reinforce the supervision signal in the SCL task, we devise a graphon-theory-based labeled graph generator that extracts the generalized knowledge of a specific class of graphs. Based on this knowledge, graph-level templates are generated for each class and used as contrastive samples in the SCL task. The proposed Con-tuning framework then jointly learns the SCL task and the downstream task to effectively fine-tune pre-trained GNNs. Extensive experiments on eight real-world datasets show that the Con-tuning framework enables pre-trained GNNs to achieve better performance on graph-level downstream tasks in low-resource settings.
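To make the joint objective concrete, the sketch below shows how a standard supervised contrastive (SupCon-style) loss over graph-level embeddings could be combined with the downstream classification loss, with generated class templates appended as extra positive samples. This is a minimal illustration under those assumptions, not the paper's exact formulation; the graphon-based template generator is not shown, and all function and variable names are hypothetical.

```python
# Hypothetical sketch of a joint fine-tuning objective: a supervised contrastive
# term over graph embeddings (augmented with per-class template embeddings) plus
# the downstream classification loss. Names and weighting are illustrative only.
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.5):
    """SupCon-style loss over graph-level embeddings.

    embeddings: (N, d) graph representations, possibly including generated
                class templates; labels: (N,) integer class indices.
    """
    z = F.normalize(embeddings, dim=1)                     # cosine similarity space
    sim = z @ z.t() / temperature                          # pairwise similarities
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))        # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Positives: other samples (including templates) sharing the same label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_count
    return loss.mean()

def joint_fine_tuning_loss(graph_emb, logits, labels,
                           template_emb, template_labels, alpha=0.5):
    """Downstream cross-entropy plus an SCL term weighted by alpha (assumed)."""
    ce = F.cross_entropy(logits, labels)
    scl = supervised_contrastive_loss(
        torch.cat([graph_emb, template_emb], dim=0),
        torch.cat([labels, template_labels], dim=0),
    )
    return ce + alpha * scl
```

In this sketch, the template embeddings act purely as additional same-class positives in the contrastive term, which is one plausible way to inject the generator's class-level knowledge into fine-tuning.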
