Abstract

The field of natural language processing has recently achieved remarkable breakthroughs, particularly since the advent of deep neural networks. These advances have enabled machine learning engineers to build deep models capable of high-level automation, allowing computer systems to interact with humans competently. Using a special type of deep neural network known as the recurrent neural network, a wide range of natural language processing applications can be realized, including sentiment analysis, part-of-speech tagging, machine translation, and text generation. This paper presents a deep, stacked long short-term memory (LSTM) network, an advanced form of recurrent neural network that can generate text from a random input seed. The paper discusses the shortcomings of the conventional recurrent neural network, thereby motivating the long short-term memory network, and describes its architecture and the methodologies adopted.

