Abstract

Attention-based bidirectional LSTM (BiLSTM) networks are widely used in machine learning for data classification and prediction. The activation function is one of the key components that enables deep training by introducing non-linearity into the learning process. Several activation functions already exist, such as Sigmoid, tanh, and ReLU. To improve performance in terms of accuracy, a novel Mtanh-Attention-BiLSTM model for classification and prediction is proposed. The proposed activation function reduces gradient problems and transforms non-linear data into a linear form. Training and classification were carried out on a primary export dataset as well as a standard car dataset. An improvement in accuracy was observed compared with the existing model.
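To make the architecture described above concrete, the sketch below shows how a modified activation could plug into an attention-based BiLSTM classifier in PyTorch. This is only an illustrative sketch: the abstract does not give the exact Mtanh formula, so the `MTanh` module here uses a hypothetical scaled tanh as a placeholder, and the class names, dimensions, and attention scheme are assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MTanh(nn.Module):
    """Placeholder for the paper's Mtanh activation.

    The exact formula is not given in the abstract, so a hypothetical
    scaled tanh is used purely to show where a modified activation
    would sit in the model."""
    def __init__(self, alpha: float = 1.5):
        super().__init__()
        self.alpha = alpha

    def forward(self, x):
        return self.alpha * torch.tanh(x)

class AttentionBiLSTM(nn.Module):
    """Minimal attention-based BiLSTM classifier sketch."""
    def __init__(self, input_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.bilstm = nn.LSTM(input_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # attention score per time step
        self.act = MTanh()                         # modified activation (placeholder)
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                          # x: (batch, seq_len, input_dim)
        h, _ = self.bilstm(x)                      # (batch, seq_len, 2*hidden_dim)
        scores = self.attn(self.act(h))            # (batch, seq_len, 1)
        weights = F.softmax(scores, dim=1)         # attention weights over time steps
        context = (weights * h).sum(dim=1)         # weighted sum of hidden states
        return self.out(context)                   # class logits

# Example: classify batches of sequences with 8 time steps and 16 features
model = AttentionBiLSTM(input_dim=16, hidden_dim=32, num_classes=3)
logits = model(torch.randn(4, 8, 16))              # shape: (4, 3)
```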
