Relation extraction in natural language processing aims to identify the relation between two specified entities in a sentence. However, existing methods do not fully exploit word feature information and pay little attention to how strongly each word influences the relation extraction result. To address these issues, we propose a relation extraction method based on the self-attention mechanism (SPCNN-VAE). First, a multi-head self-attention mechanism processes the word vectors and produces sentence feature representations that capture the semantic dependencies between words in a sentence. Then, word position features are introduced and combined with the sentence feature representation to form the input representation of a piecewise convolutional neural network (PCNN). Furthermore, to identify the word features most useful for relation extraction, an attention-based pooling operation captures the key convolutional features, which are then classified. Finally, a variational autoencoder (VAE) regularizes the model to enhance its ability to encode word feature information. Experiments on SemEval-2010 Task 8 show that the proposed relation extraction model is effective and outperforms several competitive baselines.
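For readers who want a concrete picture of the pipeline sketched in the abstract, the following minimal PyTorch sketch illustrates one way the described components could fit together: multi-head self-attention over word vectors, concatenated relative-position embeddings, a convolutional encoder, attention-based pooling, a relation classifier, and a VAE-style regularizer. All layer names, dimensions, and loss weights are illustrative assumptions rather than the authors' implementation, and the piecewise segmentation of the PCNN is omitted for brevity.

```python
# Illustrative sketch only; hyperparameters and wiring are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SPCNNVAESketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, pos_dim=25, max_len=100,
                 n_heads=4, n_filters=230, kernel_size=3, n_relations=19, latent_dim=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # relative-position embeddings with respect to the two entities
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim)
        # multi-head self-attention over the word vectors
        self.self_attn = nn.MultiheadAttention(emb_dim, n_heads, batch_first=True)
        in_dim = emb_dim + 2 * pos_dim
        self.conv = nn.Conv1d(in_dim, n_filters, kernel_size, padding=kernel_size // 2)
        # attention-based pooling over the convolutional feature map
        self.pool_attn = nn.Linear(n_filters, 1)
        # VAE head used to regularize the pooled sentence representation
        self.to_mu = nn.Linear(n_filters, latent_dim)
        self.to_logvar = nn.Linear(n_filters, latent_dim)
        self.decoder = nn.Linear(latent_dim, n_filters)
        self.classifier = nn.Linear(n_filters, n_relations)

    def forward(self, tokens, pos1, pos2):
        w = self.word_emb(tokens)                       # (B, L, emb_dim)
        w, _ = self.self_attn(w, w, w)                  # semantic dependencies between words
        x = torch.cat([w, self.pos1_emb(pos1), self.pos2_emb(pos2)], dim=-1)
        h = torch.relu(self.conv(x.transpose(1, 2)))    # (B, n_filters, L)
        h = h.transpose(1, 2)                           # (B, L, n_filters)
        a = torch.softmax(self.pool_attn(h), dim=1)     # importance weight per position
        s = (a * h).sum(dim=1)                          # pooled sentence feature (B, n_filters)
        mu, logvar = self.to_mu(s), self.to_logvar(s)   # VAE regularization of the feature
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        recon = self.decoder(z)
        logits = self.classifier(s)
        return logits, recon, s, mu, logvar

def loss_fn(logits, labels, recon, s, mu, logvar, beta=0.01):
    # classification loss plus a small VAE reconstruction/KL penalty (beta is an assumption)
    ce = F.cross_entropy(logits, labels)
    rec = F.mse_loss(recon, s.detach())
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return ce + beta * (rec + kld)
```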