Abstract

Introduction: The purpose of the research is to compare different types of recurrent neural network architectures, namely the long short-term memory (LSTM) and gated recurrent unit (GRU) architectures, as well as the convolutional neural network (CNN), and to explore their performance on the example of binary text classification. Material and Methods: To achieve this, the research evaluates the performance of these popular deep-learning approaches on a dataset of film reviews labelled with positive and negative opinions. This real-world dataset was used to train neural network models through software implementations. Results and Discussion: The research focuses on implementing a recurrent neural network for binary classification of the dataset and explores different architectures, approaches, and hyperparameters to determine the model that achieves optimal performance. The software implementation made it possible to evaluate various quality metrics and thereby compare the performance of the proposed approaches. In addition, the research examines hyperparameters such as learning rate, batch size, and regularization methods to determine their impact on model performance. Conclusion: Overall, the research provides valuable insights into the performance of neural networks in binary text classification and highlights the importance of careful architecture selection and hyperparameter tuning for achieving optimal performance.
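
As a rough illustration of the setup described above, the sketch below trains a single LSTM-based binary classifier on a public film-review dataset (the Keras IMDB set) with a sigmoid output. The dataset loader, vocabulary size, sequence length, learning rate, batch size, and dropout value are illustrative assumptions, not the exact configuration reported in the paper; replacing the LSTM layer with a GRU layer yields the other recurrent architecture compared.

    # Minimal sketch of an LSTM binary text classifier; all values below are
    # illustrative assumptions, not the study's exact configuration.
    from tensorflow import keras
    from tensorflow.keras import layers

    VOCAB_SIZE = 10_000   # assumed vocabulary cap
    MAX_LEN = 200         # assumed review length after padding/truncation

    # Public film-review data (Keras IMDB) as a stand-in for the study's dataset.
    (x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=VOCAB_SIZE)
    x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=MAX_LEN)
    x_test = keras.preprocessing.sequence.pad_sequences(x_test, maxlen=MAX_LEN)

    # Recurrent architecture: embedding -> LSTM -> dense sigmoid output.
    # Swapping layers.LSTM for layers.GRU gives the second architecture compared.
    model = keras.Sequential([
        layers.Embedding(VOCAB_SIZE, 64),
        layers.LSTM(64, dropout=0.2),           # dropout as one regularization option
        layers.Dense(1, activation="sigmoid"),  # binary positive/negative output
    ])

    # Learning rate and batch size are the kinds of hyperparameters the study varies.
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, batch_size=64, epochs=3,
              validation_data=(x_test, y_test))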
