Abstract

Continual relation extraction aims to learn new relations from a continuous stream of data while avoiding forgetting old ones. Existing methods typically use a BERT encoder to obtain semantic embeddings, ignoring the fact that these vector representations suffer from anisotropy and uneven distribution. Furthermore, relation prototypes are usually computed directly from memory samples, making the model overly sensitive to those samples. To address these problems, we propose a new continual relation extraction method. First, we modify the basic structure of the sample encoder, using supervised SimCSE-BERT to generate uniformly distributed semantic embeddings that capture richer sample information. Second, we introduce static relation prototypes and dynamically adjust their proportion relative to dynamic relation prototypes so that the prototypes adapt to the feature space. Finally, experiments on the widely used FewRel and TACRED datasets demonstrate that the proposed method effectively improves semantic embeddings and relation prototypes, further alleviating catastrophic forgetting. The code will be released soon at https://github.com/SuyueW/SS-CRE.
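The prototype construction mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's actual formulation: the mixing weight `beta`, the function name, and the assumption that the dynamic prototype is the mean of memory-sample embeddings are all hypothetical choices made for the example.

```python
import numpy as np

def combined_prototype(memory_embeddings, static_prototype, beta):
    """Mix a dynamic prototype (here assumed to be the mean of the
    current memory-sample embeddings) with a fixed static prototype.

    `beta` is a hypothetical mixing weight in [0, 1]; the paper adjusts
    this proportion dynamically, which is not reproduced here.
    """
    dynamic_prototype = memory_embeddings.mean(axis=0)
    return beta * static_prototype + (1.0 - beta) * dynamic_prototype

# Toy usage with random 4-dimensional embeddings.
rng = np.random.default_rng(0)
mem = rng.normal(size=(5, 4))   # five memory samples for one relation
static = rng.normal(size=4)     # stored static prototype for that relation
proto = combined_prototype(mem, static, beta=0.3)
```

With `beta = 1.0` the prototype falls back entirely to the static vector, and with `beta = 0.0` it reduces to the memory-sample mean, which is the behavior of prior memory-based methods.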
