Abstract
In this chapter, we go through the fundamentals of artificial neural networks and deep learning methods. We describe the inspiration for artificial neural networks and how deep learning methods are built. We define the activation function and its role in capturing nonlinear patterns in the input data. We explain the universal approximation theorem for understanding the power and limitations of these methods, and we describe the main topologies of artificial neural networks that play an important role in their successful implementation. We also describe loss functions (and their penalized versions) and explain in which circumstances each of them should be preferred. In addition to the Ridge, Lasso, and Elastic Net regularization methods, we provide details of the dropout and early stopping methods. Finally, we present the backpropagation method and illustrate it with two simple artificial neural networks.
Highlights
The inspiration for artificial neural networks (ANN), or neural networks, resulted from the admiration for how the human brain computes complex processes, which is entirely different from the way conventional digital computers do this.
It is estimated that the brain is composed of around 10^11 neurons that work in parallel, since the processing done by the neurons and the memory captured by the synapses are distributed together over the network.
The net input is evaluated in this function and we obtain the output of the network, as shown in Fig. 10.3 (General artificial neural network model).
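The computation just described can be sketched for a single artificial neuron: the net input is the weighted sum of the inputs plus a bias, and the output is obtained by passing this net input through an activation function. This is a minimal illustrative example, not code from the chapter; the function and variable names, the tanh activation, and the numeric values are all assumptions chosen for demonstration.

```python
import numpy as np

def neuron_output(x, w, b, activation=np.tanh):
    """Single artificial neuron (illustrative sketch).

    Computes the net input as the weighted sum of the inputs plus a
    bias, then applies the activation function to produce the output.
    """
    net_input = np.dot(w, x) + b  # weighted sum of inputs plus bias
    return activation(net_input)

# Hypothetical inputs, synaptic weights, and bias for illustration
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.3])
b = 0.2

y = neuron_output(x, w, b)
print(y)
```

A nonlinear activation such as tanh is what lets the network capture nonlinear patterns in the input data; with the identity activation, the neuron would reduce to a linear model.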
Summary
The inspiration for artificial neural networks (ANN), or neural networks, resulted from the admiration for how the human brain computes complex processes, which is entirely different from the way conventional digital computers do this. One of the characteristics of biological neurons, to which they owe their great capacity to process and perform highly complex tasks, is that they are highly connected to other neurons, from which they receive stimuli from an event as it occurs, or hundreds of electrical signals with the information learned. When this information reaches the body of the neuron, it affects the neuron's behavior and can also affect a neighboring neuron or muscle (Francisco-Caicedo and López-Sotelo 2009). Anderson et al. (1990) were more expressive in this sense and pointed out that "ANN are statistics for amateurs since most neural networks conceal the statistics from the user."