Abstract
Convolutional and recurrent neural networks have achieved remarkable performance in natural language processing (NLP). From the attention-mechanism perspective, however, convolutional neural networks (CNNs) are applied less often than recurrent neural networks (RNNs), because RNNs can learn long-term dependencies and therefore tend to give better results. Yet CNNs have their own advantage: they can extract high-level features from a local fixed-size context at the input level. This paper therefore proposes a new model that combines the merits of both architectures: an RNN with a CNN-based attention mechanism. In the proposed model, the CNN first learns high-level features of the sentence from the input representation. Second, an attention mechanism directs the model toward the features that contribute most to the prediction task, by computing an attention score over the feature contexts generated by the CNN filters. Finally, the RNN processes these attention-weighted CNN feature contexts sequentially. We validate the model on three benchmark datasets; the experimental results and their analysis demonstrate the model's effectiveness.
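The abstract only outlines the pipeline (CNN features, attention scores over those features, then sequential processing by an RNN). As an illustration, here is a minimal sketch of such a pipeline in PyTorch; the layer sizes, the single-vector attention scoring, and the GRU choice are assumptions for the sketch, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNAttentionRNN(nn.Module):
    """Sketch (assumed configuration): a CNN extracts local feature
    contexts, an attention score is computed per position from those
    features, and a GRU consumes the attention-weighted sequence."""

    def __init__(self, vocab_size, emb_dim=128, n_filters=128,
                 kernel_size=3, hidden_dim=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # 1-D convolution over the token sequence; padding preserves length
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # scores each CNN feature context with one learned weight vector
        # (hypothetical scoring function, stands in for the paper's)
        self.att = nn.Linear(n_filters, 1)
        self.rnn = nn.GRU(n_filters, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens):                            # tokens: (B, T)
        x = self.embed(tokens)                            # (B, T, E)
        feats = torch.relu(self.conv(x.transpose(1, 2)))  # (B, F, T)
        feats = feats.transpose(1, 2)                     # (B, T, F)
        scores = F.softmax(self.att(feats), dim=1)        # attention over positions
        weighted = feats * scores                         # reweight CNN features
        _, h = self.rnn(weighted)                         # h: (1, B, H)
        return self.out(h.squeeze(0))                     # class logits


# Usage example with dummy token ids
model = CNNAttentionRNN(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 20)))  # batch of 4 sentences, 20 tokens
```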