Few-Shot Relation Classification (FSRC) aims to predict novel relationships by learning from limited samples. Graph Neural Network (GNN) approaches for FSRC construct the data as graphs, effectively capturing sample features through graph representation learning. However, they often face several challenges: 1) They tend to neglect the interactions between samples from different support sets and overlook the implicit noise in labels, leading to sub-optimal sample feature generation. 2) They struggle to deeply mine the diverse semantic information present in FSRC data. 3) Over-smoothing and overfitting limit the model's depth and adversely affect overall performance. To address these issues, we propose a Sample Representation Enhancement model based on a Heterogeneous Graph Neural Network (SRE-HGNN) for FSRC. This method leverages inter-sample and inter-class associations (i.e., label mutual attention) to effectively fuse features and generate more expressive sample representations. Edge-heterogeneous GNNs are employed to enhance sample features by capturing heterogeneous information of varying depths through different edge attentions. Additionally, we introduce an attention-based neighbor node culling method, enabling the model to stack more layers and extract deeper inter-sample associations, thereby improving performance. Finally, experiments are conducted on the FSRC task, and SRE-HGNN achieves average accuracy improvements of 1.84% and 1.02% on two public datasets.
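The attention-based neighbor node culling mentioned above can be illustrated with a minimal sketch: before message passing, each node scores its neighbors with an attention function and keeps only the top-k highest-scoring ones, limiting over-smoothing as layers are stacked. This is not the authors' implementation; the dot-product scoring, function names, and matrix layout are illustrative assumptions.

```python
import numpy as np

def cull_neighbors(node_feats, adjacency, k):
    """Return a pruned adjacency keeping each node's top-k attended neighbors.

    node_feats: (N, D) node feature matrix.
    adjacency:  (N, N) binary adjacency matrix.
    k:          number of neighbors to retain per node.
    """
    n = adjacency.shape[0]
    # Dot-product attention scores (an assumed scoring function);
    # non-neighbors are masked out with -inf so they can never be kept.
    scores = node_feats @ node_feats.T
    scores = np.where(adjacency > 0, scores, -np.inf)
    pruned = np.zeros_like(adjacency)
    for i in range(n):
        neighbors = np.flatnonzero(adjacency[i])
        if len(neighbors) == 0:
            continue
        # Keep the k neighbors with the highest attention scores.
        keep = neighbors[np.argsort(scores[i, neighbors])[-k:]]
        pruned[i, keep] = 1
    return pruned
```

A subsequent GNN layer would then aggregate messages only over the pruned adjacency, so deeper stacks mix fewer, more relevant neighbors per hop.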