Abstract

Abstract: The use of chatbots has risen significantly in the past few years; these pieces of software appear on various e-commerce and banking platforms. Such applications mostly use rule-based chatbots; in this paper we explore the possibility of using smart chatbots that can hold a free-flowing conversation with the user. The training of these chatbots determines their ability to converse in a human-like way. The transformer model in this study has an encoder-decoder architecture with a self-attention layer. The self-attention layer gives the model context for the input, providing a reference for each word with respect to the other words in the sentence. The model is trained for various numbers of epochs, and its outputs are compared against the expected outputs; their similarity is calculated using the cosine similarity technique. The results of these tests were observed and are presented in the paper.

Keywords: Chatbot, Transformer Model, Self-Attention Layer, Epochs, Cosine Similarity.
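The paper does not include its evaluation code here; the following is a minimal sketch of the cosine similarity measure the abstract describes, assuming the model output and the expected output have already been converted to equal-length numeric vectors (the function name and vector representation are illustrative, not taken from the paper).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors:
    dot(a, b) / (||a|| * ||b||), ranging from -1 to 1,
    where 1 means the vectors point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical direction -> 1.0; orthogonal vectors -> 0.0
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # 1.0 (within float error)
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

In practice, the sentence outputs would first be embedded as vectors (e.g. via the model's own representations) before applying this measure; that embedding step is outside the scope of this sketch.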
