Abstract

Significant progress in training deep neural networks has been driven by a substantial increase in computer performance, the accumulation of the large datasets needed for training, and the development of training methods that make it possible to train networks of a hundred or more layers quickly and efficiently. As a result, deep neural networks have taken a leading position among machine learning methods. This work considers several neural network paradigms, together with their methods of training and operation: the Rosenblatt perceptron, multilayer perceptrons, radial basis function networks, the Kohonen network, the Hopfield network, the Boltzmann machine, and deep neural networks. A comparative review of these paradigms shows that all of them successfully solve the tasks for which they were designed, but deep neural networks are currently the most effective mechanism for solving practical intelligent tasks.
