Abstract

Optimizers in Convolutional Neural Networks (CNNs) play an important role in many advanced deep learning models. Studies of new optimizers and modifications to existing ones remain significant in research on machine learning tools and algorithms. A number of studies defend the selection of particular optimizers, and that selection illustrates some of the challenges in assessing optimizer effectiveness. This work offers a comprehensive analysis of these optimizers in combination with the widely used Rectified Linear Unit (ReLU) activation function; significance is determined by comparison against a baseline configuration using the original Softmax and ReLU. Experiments were performed with Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD) to examine the performance of CNNs for image classification using the Canadian Institute for Advanced Research dataset (CIFAR-10).
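
For orientation, the sketch below shows how such an optimizer comparison could be set up in Keras. It is a minimal illustration, not the authors' reported configuration: the network architecture, epoch count, and batch size are assumptions, and all five optimizers are used with their library defaults.

```python
# Minimal sketch: compare five optimizers on CIFAR-10 with a small
# ReLU/Softmax CNN. Architecture and hyperparameters are illustrative
# assumptions, not the configuration reported in the paper.
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

def build_cnn():
    # Small CNN: ReLU in the hidden layers, Softmax at the output.
    return keras.Sequential([
        keras.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

optimizers = {
    "Adam": keras.optimizers.Adam(),
    "RMSprop": keras.optimizers.RMSprop(),
    "Adadelta": keras.optimizers.Adadelta(),
    "Adagrad": keras.optimizers.Adagrad(),
    "SGD": keras.optimizers.SGD(),
}

for name, opt in optimizers.items():
    model = build_cnn()  # fresh weights for each optimizer
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=64, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {acc:.4f}")
```

Note that each optimizer trains a freshly initialized model, so the comparison is not confounded by weights carried over from an earlier run.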
