Abstract
To study optimization algorithms for convolutional neural networks, this paper combines the traditional stochastic gradient descent method with momentum and fractional-order optimization, and derives the momentum-based fractional-order stochastic gradient descent (MFSGD) algorithm for the fully connected layers and the convolutional layers of a convolutional neural network, respectively. The recognition accuracy and training loss of MFSGD with different fractional orders are compared. The results show that MFSGD outperforms the fractional-order gradient descent method, and that there exists an optimal order that achieves the best recognition accuracy and training loss.
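The abstract does not give the MFSGD update rule, but the combination it describes (momentum accumulation applied to a fractional-order gradient) can be sketched on a scalar toy problem. The fractional-power scaling `grad * |grad|**(alpha - 1) / Γ(2 - alpha)` below is one common simplified approximation of a fractional-order gradient, not the paper's exact derivation; the function and parameter names (`mfsgd_step`, `alpha`, `beta`) are illustrative assumptions.

```python
import math

def mfsgd_step(w, grad, v, lr=0.01, beta=0.9, alpha=1.2, eps=1e-8):
    # Assumed simplified fractional-order gradient:
    # scale the gradient by |grad|^(alpha - 1) / Gamma(2 - alpha).
    frac_grad = grad * (abs(grad) + eps) ** (alpha - 1) / math.gamma(2 - alpha)
    v = beta * v + frac_grad      # momentum accumulation on the fractional gradient
    return w - lr * v, v          # parameter update

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, v = 0.0, 0.0
for _ in range(500):
    grad = 2.0 * (w - 3.0)
    w, v = mfsgd_step(w, grad, v)
# w approaches the minimizer 3.0
```

For order alpha = 1 the fractional scaling reduces to the ordinary gradient (Γ(1) = 1), so this sketch degenerates to standard momentum SGD, which matches the abstract's framing of MFSGD as a generalization.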