Abstract

Compared with conventional closed-domain relation classification, incremental few-shot relation classification requires incrementally learning novel relations from very few samples without forgetting base relations, which better fits the needs of real-world scenarios. Existing incremental few-shot learning methods commonly leverage memory to freeze knowledge of the learned base classes, and then incrementally learn new classes from very few labeled samples. However, two challenges remain for these approaches: overfitting to novel relations and catastrophic forgetting of base relations. In this paper, we propose a relational concept enhanced prototypical network (RC-EProto) to address these problems. First, we build a concept alignment module to eliminate the gap between relational concepts and samples, and we use this module to incorporate relational concepts into sample representations, constructing more appropriate prototypes. Second, we develop a contrastive approach on the base relations to mitigate the forgetting of old knowledge: samples of a base relation are pulled close to its prototype in the embedding space, while samples of other relations are pushed away from that base prototype. Extensive experiments on the FewRel 1.0 and 2.0 datasets demonstrate the superiority of our method over state-of-the-art methods. Notably, RC-EProto maintains an excellent incremental few-shot domain adaptation ability. Compared to existing methods, our model achieves near-optimal results on the base relations while improving the classification accuracy of the novel relations by 13.48% and 9.63% under the 1-increment and 5-increment settings on FewRel 2.0, respectively.
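The abstract describes two core mechanisms: nearest-prototype classification (as in prototypical networks) and a contrastive objective that pulls a base relation's samples toward its own prototype while pushing other relations' samples away. The sketch below illustrates both ideas with NumPy; the function names, the Euclidean metric, and the margin-based push term are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def build_prototypes(support, labels):
    # Prototype of each class = mean of its support embeddings.
    # support: (N, d) array of embeddings; labels: (N,) integer class ids.
    classes = np.unique(labels)
    return np.stack([support[labels == c].mean(axis=0) for c in classes])

def proto_classify(queries, protos):
    # Assign each query to the nearest prototype (Euclidean distance).
    dists = np.linalg.norm(queries[:, None, :] - protos[None, :, :], axis=-1)
    return dists.argmin(axis=1)

def contrastive_proto_loss(embeddings, labels, protos, margin=1.0):
    # Illustrative pull/push objective (hypothetical form, not the paper's):
    # pull each sample toward its own class prototype, and push it at least
    # `margin` away from every other prototype via a hinge term.
    total = 0.0
    for x, y in zip(embeddings, labels):
        d = np.linalg.norm(protos - x, axis=1)
        pull = d[y]                                   # distance to own prototype
        push = np.maximum(0.0, margin - np.delete(d, y)).sum()
        total += pull + push
    return total / len(embeddings)

# Tiny usage example with two well-separated relations in 2-D.
support = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labels = np.array([0, 0, 1, 1])
protos = build_prototypes(support, labels)
preds = proto_classify(np.array([[0.0, 0.2], [9.0, 10.0]]), protos)
```

In this toy setup the two queries fall to the nearest of the two class-mean prototypes, and the loss rewards embeddings whose base-relation samples sit tightly around their own prototype.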
