Abstract

Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are widely used in natural language processing. However, natural language depends strongly on structure: a CNN alone ignores the semantic and grammatical relationships between words, while a traditional RNN suffers from vanishing-gradient (gradient dispersion) problems. To address this, the paper designs a CNN-BLSTM network through layer-wise combination, introduces an attention mechanism, and proposes a CNN-BLSTM+Attention model. The fused model handles the position invariance of local features and extracts informative local features. The attention mechanism automatically weights the output sequence at each time step, reducing the loss of key features when the RNN models sequential features, so that feature extraction is performed in both the spatial and temporal dimensions. Experimental results show that the accuracy of the proposed model is 3 to 4 percentage points higher than that of the compared models. When processing alarm text, the model preserves the local correlation of the data while strengthening its ability to combine sequential features effectively.
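The abstract does not include code, so the following PyTorch sketch is only one plausible reading of the described pipeline: a convolutional layer extracts local features, a bidirectional LSTM models their sequential dependencies, and an attention layer weights each time step before classification. All hyperparameters, layer sizes, and names (e.g. `CNNBLSTMAttention`, `conv_channels`, `lstm_hidden`) are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNBLSTMAttention(nn.Module):
    """Illustrative CNN-BLSTM+Attention text classifier (hyperparameters assumed)."""

    def __init__(self, vocab_size, embed_dim=128, conv_channels=64,
                 kernel_size=3, lstm_hidden=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # CNN stage: extract position-invariant local n-gram features.
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size,
                              padding=kernel_size // 2)
        # BLSTM stage: model sequential dependencies over the local features.
        self.blstm = nn.LSTM(conv_channels, lstm_hidden,
                             batch_first=True, bidirectional=True)
        # Attention stage: score each time step of the BLSTM output.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed_dim)
        x = F.relu(self.conv(x.transpose(1, 2)))       # (batch, channels, seq_len)
        x = x.transpose(1, 2)                          # (batch, seq_len, channels)
        h, _ = self.blstm(x)                           # (batch, seq_len, 2*hidden)
        # Attention weights over time steps; the weighted sum replaces pooling,
        # which is how key features at individual steps avoid being lost.
        weights = torch.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)             # (batch, 2*hidden)
        return self.classifier(context)                # (batch, num_classes)
```

Replacing max pooling with an attention-weighted sum is what lets the model keep per-time-step information rather than only the single strongest activation, which matches the abstract's claim of reduced key-feature loss.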
