Abstract

In recent years, Convolutional Neural Networks (CNNs) have performed very well on many complex tasks. When training a CNN, the Stochastic Gradient Descent (SGD) algorithm is widely used to optimize the network's loss function. However, SGD has disadvantages that need to be addressed, such as its tendency to fall into local optima and the vanishing gradient problem. In this paper, we propose a new hybrid algorithm that aims to tackle these disadvantages by combining the strengths of the Lclose Particle Swarm Optimization (LPSO) and SGD algorithms. Particle Swarm Optimization (PSO) is a global optimization algorithm, but it does not perform well when optimizing the loss function of a neural network because of the network's high-dimensional weight parameters and the unbounded search space. To combine the excellent global search capability of LPSO with the rapid convergence of SGD, we design the LPSO-SGD algorithm. In the experimental part, we construct the LeNet-5 deep CNN to classify the MNIST dataset, and the experimental results demonstrate that the proposed algorithm performs better than the standard SGD algorithm.
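The abstract does not specify how LPSO and SGD are interleaved, so the sketch below only illustrates the general hybrid idea: use PSO's global exploration to locate a promising region of a non-convex loss landscape, then switch to gradient descent for fast local convergence. All function names, hyperparameters, and the toy Rastrigin loss are illustrative assumptions, not the paper's LPSO-SGD algorithm.

```python
import numpy as np

# Toy non-convex loss (Rastrigin): many local minima, global minimum at w = 0.
def loss(w):
    return 10 * w.size + np.sum(w**2 - 10 * np.cos(2 * np.pi * w))

def grad(w):
    # Analytic gradient of the Rastrigin loss above.
    return 2 * w + 20 * np.pi * np.sin(2 * np.pi * w)

def pso_search(dim=10, n_particles=30, iters=200, seed=0):
    """Plain PSO: global exploration to find a promising starting point."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.12, 5.12, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                  # per-particle best positions
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()            # swarm-wide best position
    w_inertia, c1, c2 = 0.7, 1.5, 1.5                   # common PSO coefficients
    for _ in range(iters):
        r1 = rng.random(pos.shape)
        r2 = rng.random(pos.shape)
        vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

def sgd_refine(w, lr=0.002, steps=500):
    """Gradient descent from the PSO solution for fast local convergence."""
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

w0 = pso_search()
w_star = sgd_refine(w0)
print(f"PSO loss: {loss(w0):.4f}  ->  after gradient refinement: {loss(w_star):.4f}")
```

On a multimodal loss like this, gradient descent from a random start often stalls in a local minimum, while the swarm-based warm start lets the gradient phase converge to a much lower value; this is the same division of labor, global search followed by local refinement, that the paper proposes for CNN training.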
