Abstract

Relation classification (RC) aims to detect the semantic relation between two annotated entities in a sentence, and it serves as an essential task in automatic knowledge graph construction. Because new relations continually emerge, there is a recent trend toward training RC models in continual settings. To overcome the catastrophic forgetting problem in continual learning, existing research adopts a two-stage training paradigm: fast adaptation to novel relations, followed by memory replay over all historical relations. These memory-replay-based methods explore different techniques to mitigate forgetting in continual RC (CRC) models during the memory replay stage. However, we find that the representation space is distorted by the arrival of new relations during the fast adaptation phase. To address this issue, we propose a knowledge distillation strategy and design a margin loss that together keep the RC model stable while it adapts to new relations. In the second stage, where only a limited number of typical memory instances are available, we introduce a self-contrastive learning objective that facilitates learning a balanced decision boundary for RC. Through this two-stage training, we aim to obtain a stable representation space for encoding instances in CRC. We experimentally demonstrate the superiority of our model over competing methods in various settings, and the results suggest that our tailored designs achieve better performance in CRC.
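To make the two training objectives more concrete, the sketch below illustrates one plausible instantiation of the losses described above: a distillation term and a margin term for the fast-adaptation stage, and a supervised self-contrastive term for the memory-replay stage. This is a minimal PyTorch-style sketch under our own assumptions; the function names, prototype-based margin formulation, and hyperparameters are illustrative and are not taken from the paper's published implementation.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Knowledge distillation (stage 1): keep the adapting model's predictions
    close to those of the frozen previous model, stabilizing the representation
    space against distortion from new relations. (Illustrative formulation.)"""
    t = temperature
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)

def margin_loss(embeddings, labels, prototypes, margin=0.2):
    """Margin objective (stage 1, assumed prototype-based variant): each instance
    should be closer to its own relation prototype than to the hardest other
    prototype by at least `margin`."""
    sims = F.normalize(embeddings, dim=-1) @ F.normalize(prototypes, dim=-1).T
    pos = sims.gather(1, labels.unsqueeze(1))                  # similarity to own prototype
    others = sims.scatter(1, labels.unsqueeze(1), float("-inf"))
    neg = others.max(dim=1, keepdim=True).values               # hardest competing prototype
    return F.relu(margin - pos + neg).mean()

def self_contrastive_loss(embeddings, labels, temperature=0.1):
    """Self-contrastive objective (stage 2): over the replayed memory instances,
    pull same-relation instances together and push different relations apart,
    encouraging a balanced decision boundary."""
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.T / temperature
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    pos_mask.fill_diagonal_(0)                                  # exclude self-pairs
    logits = sim - torch.eye(len(z), device=z.device) * 1e9     # remove self from denominator
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    denom = pos_mask.sum(1).clamp(min=1)
    return -(pos_mask * log_prob).sum(1).div(denom).mean()
```

In such a setup, the stage-1 loss would typically combine the cross-entropy on new-relation data with `kd_loss` and `margin_loss`, while stage 2 would apply `self_contrastive_loss` (plus cross-entropy) to the small replayed memory set; the weighting between terms is an open design choice not specified here.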
