Abstract

Meta-learning-based few-shot learning methods have recently been widely used for relation classification. Previous work reveals that meta-learning performs poorly when the marginal probability distribution of the target-domain dataset differs significantly from that of the source domain. In this paper, we enhance the meta-learning framework with high-dimensional semantic feature extraction and a hyperplane projection metric for meta-tasks. First, we sharpen BERT's focus on entity words by adding entity markers and vector pooling. Then, the high-dimensional semantic features of the support set are extracted and transformed into hyperplanes. Finally, we obtain classification results by computing the projection distance between the query sample and each hyperplane. In addition, we design an auxiliary function with a plane correction factor, which amplifies the spacing between hyperplanes and reduces category confusion; this is important for addressing the problem of metric-space loss. Experiments on two real-world few-shot datasets show that our model, HPN, classifies few-shot relations more effectively in both same-domain and domain-adaptation scenarios, and that HPN is more stable on none-of-the-above (NOTA) tasks.
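To make the projection-distance step concrete, the following is a minimal toy sketch of nearest-hyperplane classification: a query embedding is assigned to the class whose hyperplane it lies closest to. The relation labels, the 2-D embeddings, and the hyperplane parameters here are all hypothetical illustrations, not the paper's learned representations.

```python
import numpy as np

def hyperplane_distance(query, normal, bias):
    """Point-to-hyperplane distance |w . q + b| / ||w||."""
    return abs(np.dot(normal, query) + bias) / np.linalg.norm(normal)

def classify(query, hyperplanes):
    """Assign the query to the class whose hyperplane is nearest."""
    dists = {label: hyperplane_distance(query, w, b)
             for label, (w, b) in hyperplanes.items()}
    return min(dists, key=dists.get)

# Toy 2-D example with hypothetical class hyperplanes (w, b),
# each standing in for a hyperplane built from a support set.
hyperplanes = {
    "born_in":   (np.array([1.0, 0.0]), -2.0),  # plane x = 2
    "works_for": (np.array([0.0, 1.0]),  1.0),  # plane y = -1
}
query = np.array([2.1, 5.0])
print(classify(query, hyperplanes))  # -> born_in (distance 0.1 vs 6.0)
```

In the actual model the hyperplanes would be derived from high-dimensional BERT features of the support set; this sketch only illustrates the distance metric used at query time.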
