Abstract
Few-Shot Relation Classification (FSRC) aims to predict novel relations by learning from limited samples. Graph Neural Network (GNN) approaches to FSRC construct the data as graphs, effectively capturing sample features through graph representation learning. However, they often face several challenges: 1) they tend to neglect the interactions between samples from different support sets and overlook the implicit noise in labels, leading to sub-optimal sample feature generation; 2) they struggle to deeply mine the diverse semantic information present in FSRC data; 3) over-smoothing and overfitting limit the model's depth and adversely affect overall performance. To address these issues, we propose a Sample Representation Enhancement model based on a Heterogeneous Graph Neural Network (SRE-HGNN) for FSRC. The method leverages inter-sample and inter-class associations (i.e., label mutual attention) to fuse features effectively and generate more expressive sample representations. Edge-heterogeneous GNNs enhance sample features by capturing heterogeneous information at varying depths through different edge attentions. In addition, we introduce an attention-based neighbor-node culling method that allows the model to stack more layers and extract deeper inter-sample associations, thereby improving performance. Finally, experiments on the FSRC task show that SRE-HGNN achieves average accuracy improvements of 1.84% and 1.02% on two public datasets.
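The abstract does not specify the exact implementation of the attention-based neighbor-node culling step, but the idea can be illustrated with a minimal sketch: score each node's neighbors with a learned attention, keep only the top-k highest-scoring neighbors, and aggregate over the pruned set so that deeper stacking is less prone to over-smoothing. The class name, the top_k parameter, and the dense-adjacency interface below are hypothetical choices made for illustration, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionNeighborPruning(nn.Module):
    """Illustrative sketch (not the paper's implementation) of
    attention-based neighbor culling: score neighbors with learned
    attention, retain only the top-k per node, then aggregate."""

    def __init__(self, dim: int, top_k: int = 3):
        super().__init__()
        self.top_k = top_k
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: [N, dim] node (sample) features; adj: [N, N] binary adjacency (1 = edge)
        q, k = self.query(x), self.key(x)
        scores = q @ k.t() / (x.size(-1) ** 0.5)              # pairwise attention logits
        scores = scores.masked_fill(adj == 0, float("-inf"))  # restrict to existing edges

        # Cull neighbors: keep only the top-k highest-scoring edges per node.
        k_eff = min(self.top_k, x.size(0))
        topk_val, topk_idx = scores.topk(k_eff, dim=-1)
        pruned = torch.full_like(scores, float("-inf"))
        pruned.scatter_(-1, topk_idx, topk_val)

        weights = F.softmax(pruned, dim=-1)   # attention over surviving neighbors
        weights = torch.nan_to_num(weights)   # isolated nodes get all-zero weights
        return weights @ x                    # aggregated neighbor features

# Example usage: 10 support/query nodes with 16-dimensional features.
x = torch.randn(10, 16)
adj = (torch.rand(10, 10) > 0.5).float()
out = AttentionNeighborPruning(dim=16, top_k=3)(x, adj)  # shape [10, 16]
```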