Abstract

Relation extraction is one of the most important intelligent information extraction technologies and can be used to construct and optimize services in intelligent communication systems (ICS). One issue with existing relation extraction approaches is that they use a one-sided sentence embedding as the final prediction vector, which degrades relation extraction performance. In this paper we present REEGAT (RoBERTa Entity Embedding and Graph Attention networks enhanced sentence representation), a novel relation extraction model that incorporates the idea of enhancing word embeddings with graph neural networks. The model first uses RoBERTa to obtain word embeddings and a PyTorch embedding layer to obtain relation embeddings. Then, the multi-head attention mechanism of the GAT (graph attention network) is introduced to weight the word and relation embeddings, further enriching the meaning conveyed by the word embeddings. Finally, the entity embedding component obtains the sentence representation by pooling the word embeddings from the GAT together with the entity embeddings from named entity recognition. The weighted and pooled word embeddings carry more relational information, which alleviates the one-sidedness of the sentence representation. Experimental results demonstrate that our model outperforms other standard methods.
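The pipeline the abstract describes (attention-weighting word and relation embeddings GAT-style, then pooling with entity embeddings into a sentence representation) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: random vectors stand in for RoBERTa word embeddings and the learned relation embedding, the graph and entity positions are toy examples, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gat_head(h, adj, W, a):
    """One GAT attention head: score each edge, normalize, aggregate."""
    z = h @ W                                    # (n, d_out) projected features
    d = z.shape[1]
    # e[i, j] = LeakyReLU(a_left . z_i + a_right . z_j), as in GAT
    e = leaky_relu((z @ a[:d])[:, None] + (z @ a[d:])[None, :])
    e = np.where(adj > 0, e, -1e9)               # attend only along graph edges
    return softmax(e, axis=1) @ z                # attention-weighted aggregation

def multi_head_gat(h, adj, n_heads, d_out):
    """Concatenate several independent attention heads (multi-head GAT)."""
    d_in = h.shape[1]
    heads = []
    for _ in range(n_heads):
        W = rng.normal(size=(d_in, d_out)) * 0.1
        a = rng.normal(size=2 * d_out) * 0.1
        heads.append(gat_head(h, adj, W, a))
    return np.concatenate(heads, axis=1)

# Toy inputs: 5 "word" nodes plus 1 "relation" node with 8-dim embeddings
# (stand-ins for RoBERTa word embeddings and a relation embedding).
n_words, dim = 5, 8
h = rng.normal(size=(n_words + 1, dim))
adj = np.ones((n_words + 1, n_words + 1))        # fully connected toy graph

words = multi_head_gat(h, adj, n_heads=2, d_out=4)[:n_words]  # (5, 8)

# Entity embedding: pool the word vectors at the entity positions found by
# NER (positions here are made up), then combine with a sentence-level pool.
entity_vec = words[[0, 3]].mean(axis=0)          # hypothetical entity mentions
sentence_vec = np.concatenate([words.mean(axis=0), entity_vec])
print(sentence_vec.shape)                        # → (16,)
```

The resulting `sentence_vec` mixes graph-attention-weighted word information with entity-focused pooling, which is the shape of the remedy the abstract proposes for one-sided sentence embeddings.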
