Abstract

A great deal of operational information exists in the form of text, so extracting operational information from unstructured military text is of great significance for assisting command decision making and operations. Military relation extraction, one of the main tasks of military information extraction, aims to identify the relation between two named entities in unstructured military texts. However, traditional military relation extraction methods struggle with inadequate manual features and inaccurate Chinese word segmentation in the military domain, and fail to make full use of the symmetrical entity relations in military texts. We present a Chinese military relation extraction method based on a pre-trained language model, which combines a bi-directional gated recurrent unit (BiGRU) with a multi-head attention mechanism (MHATT). More specifically, we construct an embedding layer on top of the pre-trained language model that combines word embedding with position embedding; the output vectors of the BiGRU network are symmetrically spliced to learn the contextual semantic features, and the multi-head attention mechanism is fused in to improve the model's ability to express semantic information. We conduct extensive experiments on a military text corpus that we built, demonstrating the superiority of our method over the traditional non-attention model, the attention model, and the improved attention model; the comprehensive evaluation metric, F1-score, improves by about 4%.
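To make the embedding design concrete, below is a minimal PyTorch sketch of an embedding layer that concatenates pre-trained word vectors with two relative position embeddings (one per target entity), in the spirit of the abstract. The class name, dimensions, and offset bucketing are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class RelationEmbedding(nn.Module):
    """Sketch: combine pre-trained word embeddings with relative
    position embeddings. Dimensions are assumptions for illustration."""

    def __init__(self, word_dim=768, pos_vocab=200, pos_dim=32):
        super().__init__()
        # One position table per entity: each token is encoded by its
        # (bucketed) relative distance to the head and tail entity.
        self.pos_head = nn.Embedding(pos_vocab, pos_dim)
        self.pos_tail = nn.Embedding(pos_vocab, pos_dim)
        self.out_dim = word_dim + 2 * pos_dim

    def forward(self, word_vecs, head_dist, tail_dist):
        # word_vecs: (batch, seq, word_dim) from a pre-trained LM (e.g. BERT)
        # head_dist / tail_dist: (batch, seq) non-negative relative offsets
        return torch.cat(
            [word_vecs, self.pos_head(head_dist), self.pos_tail(tail_dist)],
            dim=-1,
        )
```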

Highlights

  • With the progress of science and technology and the evolution of war patterns, operational information and intelligence data have increased exponentially

  • As one of the basic tasks in military information extraction technology, military relation extraction is a key approach to creating military knowledge bases and a military knowledge graph [1]

  • Bi-directional encoder representations from transformers (BERT) is a pre-trained language model proposed by Google in 2018 [28]


Introduction

With the progress of science and technology and the evolution of war patterns, operational information and intelligence data have increased exponentially. As one of the basic tasks in military information extraction technology, military relation extraction is a key approach to creating military knowledge bases and a military knowledge graph [1]. This approach facilitates improvements in the quality of operational information services, assisting commanders in decision making. The effect of relation extraction has been significantly improved by the in-depth application of deep neural network models. To address the above issues, we design a feature representation method that combines word embedding with position embedding based on the pre-trained model, and we use BiGRU networks and the multi-head attention mechanism to capture the semantic features of military text and achieve effective extraction of military relations. We also establish the types and tagging methods of military relations, and construct a military relation corpus of a certain scale by analyzing the semantic features of military texts. A sketch of the encoder is given below.
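The following is a hedged PyTorch sketch of the encoder outlined above: a BiGRU whose spliced forward/backward outputs pass through multi-head self-attention and are pooled into a relation classifier. The module name, layer sizes, head count, and mean pooling are assumptions for illustration, not the authors' exact settings.

```python
import torch
import torch.nn as nn

class BiGRUMHATT(nn.Module):
    """Sketch: BiGRU encoder with spliced bi-directional outputs,
    followed by multi-head self-attention and a relation classifier."""

    def __init__(self, in_dim=832, hidden=256, heads=8, n_relations=10):
        super().__init__()
        self.bigru = nn.GRU(in_dim, hidden, batch_first=True,
                            bidirectional=True)
        # Forward and backward GRU states are concatenated -> 2 * hidden.
        self.mhatt = nn.MultiheadAttention(2 * hidden, heads,
                                           batch_first=True)
        self.classifier = nn.Linear(2 * hidden, n_relations)

    def forward(self, x):
        h, _ = self.bigru(x)            # (batch, seq, 2*hidden)
        a, _ = self.mhatt(h, h, h)      # self-attention over the sequence
        pooled = a.mean(dim=1)          # sentence-level representation
        return self.classifier(pooled)  # logits over relation types
```

Feeding this module the output of the `RelationEmbedding` sketch above (dimension 768 + 2 × 32 = 832) yields per-sentence relation logits.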

Related Works
Military Relation Extraction
Embedding Layer
Word Embedding
Position Embedding
Relative Position Embedding
BiGRU Layer
Multi-Head Attention
Dataset
Evaluation Criterion
Parameter Settings
Comparison of Results on Different Embedding Methods
Comparison of Results on Different Feature Extraction Models
Comparison of Results on Different Training Data Sizes
Comparison of Results on Different Sentence Lengths
Results
Conclusions and Future Work