Abstract
Entity relationship extraction is a core task in information extraction; its goal is to extract triples <entity e1, relation r, entity e2> from unstructured text. Current relation extraction models are mainly based on BiLSTM neural networks, and most introduce only sentence-level attention mechanisms. Such models have complex structural parameters, which easily leads to over-fitting, and they fail to capture word-level information within the sentence. To address these problems, we propose a model based on a multi-attention mechanism and a BiGRU network. The model uses BiGRU as the main encoding structure; by reducing the number of parameters, training efficiency is effectively improved. At the same time, a multi-attention mechanism is introduced to learn the influence of different features on relation classification at both the word level and the sentence level, improving relation extraction through different weight settings. The model is evaluated on the SemEval-2010 Task 8 dataset, and experiments show that it significantly outperforms the baseline methods.
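The word-level attention described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes dot-product scoring of each BiGRU hidden state against a learned query vector, followed by a softmax to obtain per-word weights and a weighted sum to form the sentence representation. The function names and the scoring choice are illustrative assumptions.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def word_attention(hidden_states, query):
    """Illustrative word-level attention (not the paper's exact formulation).

    hidden_states: list of per-word hidden vectors (e.g., BiGRU outputs)
    query: a learned query vector of the same dimensionality
    Returns (attention weights, attention-weighted sentence vector).
    """
    # Score each word's hidden state by its dot product with the query.
    scores = [sum(h_d * q_d for h_d, q_d in zip(h, query)) for h in hidden_states]
    # Normalize scores into attention weights that sum to 1.
    weights = softmax(scores)
    # Sentence representation: weighted sum of the word hidden states.
    dim = len(hidden_states[0])
    sentence = [sum(w * h[d] for w, h in zip(weights, hidden_states))
                for d in range(dim)]
    return weights, sentence
```

In the full model, this weighted sentence vector would be combined with sentence-level attention over multiple instances before the final relation classifier.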