Abstract
Rapid and accurate identification of a battlefield target's tactical intention is a prerequisite for victory in war. With the advent of information warfare, manual recognition of the combat intention of air targets has become increasingly inefficient. Moreover, traditional methods recognize the combat intention of air targets from data at a single moment in time, so they struggle to capture the characteristic information carried by time-series data. In this context, we design a new deep learning method, an attention mechanism with a temporal convolutional network and a bidirectional gated recurrent unit (Attention-TCN-BiGRU), to improve recognition of the combat intent of air targets. Specifically, suitable characteristics are selected according to the combat mission and air posture to construct a characteristic set of air target intentions, which is encoded into temporal characteristics. Each characteristic in the set is assigned an appropriate weight through the attention mechanism. In addition, a temporal convolutional network (TCN) mines latent characteristics in the data, and a bidirectional gated recurrent unit (BiGRU) captures long-term dependencies in the data. Comparison and ablation experiments demonstrate that Attention-TCN-BiGRU outperforms state-of-the-art methods in the accuracy of recognizing the intent of air targets.
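The attention step described above can be sketched in isolation: a softmax over per-characteristic scores yields weights that are applied to every time step of the feature sequence before it enters the TCN. This is a minimal numpy illustration, not the paper's implementation; the function names and the idea of supplying scores directly (rather than learning them) are assumptions for clarity.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def attention_weight_features(x, scores):
    """Weight each characteristic (feature channel) of a time series.

    x      : (T, F) array - T time steps, F characteristics
             (e.g. altitude, speed, heading).
    scores : (F,) attention scores; in the real model these would be
             learned, here they are supplied directly (an assumption).
    Returns the weighted series and the attention weights.
    """
    w = softmax(scores)   # weights sum to 1 across the characteristic set
    return x * w, w       # broadcast each weight over every time step

# Toy example: 4 time steps, 3 characteristics.
x = np.ones((4, 3))
weighted, w = attention_weight_features(x, np.array([2.0, 0.0, 0.0]))
```

Because the weights are shared across time steps, a characteristic judged more informative for intention recognition is amplified throughout the whole sequence.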
Highlights
With the development of military and aviation technology, informationization has gradually become the core of the modern battlefield, and future wars will be informationized
In response to the drawbacks of the above methods, Liu et al. [16] established an air combat target intention prediction model based on a long short-term memory (LSTM) network under incomplete information, introduced cubic spline interpolation fitting and mean padding to repair the incomplete data, and used the adaptive moment estimation (Adam) optimization algorithm to accelerate training of the intention prediction model and help it avoid local optima
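The data-repair step Liu et al. describe can be sketched as follows. This is an illustrative reading, not their code: gaps in a sampled numeric series are filled by a cubic spline fitted through the observed points (via scipy's `CubicSpline`), and a simple mean-padding helper fills a column with its mean; the function names are hypothetical.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def repair_series(t, y):
    """Fill NaN gaps in a sampled numeric series by cubic spline
    interpolation through the observed points (the spline-fitting step)."""
    obs = ~np.isnan(y)
    spline = CubicSpline(t[obs], y[obs])
    repaired = y.copy()
    repaired[~obs] = spline(t[~obs])
    return repaired

def mean_pad(col):
    """Replace NaN entries with the column mean (the mean-padding step)."""
    filled = col.copy()
    filled[np.isnan(filled)] = np.nanmean(filled)
    return filled

# Toy example: y = t**2 sampled at 6 points, with one missing value.
t = np.arange(6.0)
y = np.array([0.0, 1.0, np.nan, 9.0, 16.0, 25.0])
y_fixed = repair_series(t, y)
```

Repairing the sequence before training lets the LSTM see a complete temporal picture instead of discarding partially observed time steps.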
Xue et al. [17] designed the panoramic convolutional long short-term memory neural network (PCLSTM), a new deep learning method, to improve intention recognition, and designed a time-series pooling layer to reduce the number of neural network parameters
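The exact form of Xue et al.'s time-series pooling layer is not given here; one common reading, sketched below as an assumption, is average pooling along the time axis. Shortening the sequence shrinks any subsequent fully connected layer and hence the parameter count.

```python
import numpy as np

def time_series_avg_pool(x, window):
    """Average-pool a (T, F) sequence along the time axis.

    Shortening T to ceil(T / window) reduces the input size of any
    subsequent dense layer, and hence its parameter count.
    """
    T, F = x.shape
    pad = (-T) % window  # repeat the last step so T divides evenly
    x = np.concatenate([x, np.repeat(x[-1:], pad, axis=0)])
    return x.reshape(-1, window, F).mean(axis=1)

# Toy example: 6 time steps, 2 features, pooled pairwise down to 3 steps.
x = np.arange(12.0).reshape(6, 2)
pooled = time_series_avg_pool(x, window=2)
```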
Summary
With the development of military and aviation technology, informationization has gradually become the core of the modern battlefield, and future wars will be informationized. A target's dynamic attributes and the battlefield environment change over time, and enemy targets employ a degree of concealment and deception when performing combat operations. As a result, deep learning methods that judge the combat intention of an enemy target from characteristic information at a single moment are not sound enough. The two LSTM-based methods above have been experimentally verified to be somewhat effective for air target tactical intention recognition. In view of these problems, we propose an air target tactical intention recognition model based on Attention-TCN-BiGRU. The experimental results show that the Attention-TCN-BiGRU model is superior to other methods in intention recognition accuracy and has theoretical significance and reference value for auxiliary combat systems. (Mathematics 2021, 9, 2412.) The remainder of the paper covers the combat intention of air targets; Section 3 describes in detail the model proposed in this paper; Section 4 gives the experimental results and analysis; Section 5 concludes the paper