Abstract

Activation functions (AFs) increase the representational power of artificial neural networks. The tansig and logsig functions are widely used AFs, but they still leave room for improvement. In this paper, we therefore propose a NewSigmoid AF for neural networks. NewSigmoid is as powerful as tansig and logsig and, in multiple cases, gives better or equivalent performance compared with both of these AFs. Like them, NewSigmoid is a smooth, S-shaped, bounded, continuously differentiable, and zero-centered function, and it is therefore also suitable for solving non-linear problems. We have tested this AF on the iris, cancer, glass, chemical, bodyfat, wine, and ovarian datasets, using the Scaled Conjugate Gradient (SCG), Levenberg-Marquardt (LM), and Bayesian Regularization (BR) algorithms to train the networks. With multilayer neural networks, we obtain a maximum of 100% accuracy on the iris dataset using LM and BR; 99.9% accuracy on the cancer dataset using BR; 100% accuracy on the glass dataset using BR; 100% accuracy on the chemical and bodyfat datasets using SCG, LM, and BR; 100% accuracy on the wine dataset using LM and BR; and 99.1% accuracy on the ovarian dataset using BR. NewSigmoid also achieves 100% training and validation accuracy on the MathWorks-cap image dataset using SCG.
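As a point of reference for the properties named above (smooth, S-shaped, bounded, continuously differentiable, zero-centered), the short Python/NumPy sketch below defines the two standard activations mentioned in the abstract, tansig and logsig, and a purely hypothetical zero-centered placeholder. The abstract does not give the formula of NewSigmoid, so `new_sigmoid_placeholder` is only an illustrative stand-in, not the authors' function; the actual definition appears in the full text.

```python
# Minimal sketch (not from the paper): reference activations and a hypothetical
# zero-centered stand-in used only to illustrate the properties discussed above.
import numpy as np

def logsig(x):
    """Logistic sigmoid: smooth, bounded in (0, 1), but not zero-centered."""
    return 1.0 / (1.0 + np.exp(-x))

def tansig(x):
    """Hyperbolic-tangent sigmoid (MATLAB's tansig): smooth, bounded in (-1, 1), zero-centered."""
    return np.tanh(x)

def new_sigmoid_placeholder(x):
    """Hypothetical stand-in for NewSigmoid: a logistic rescaled to (-1, 1), so it is
    smooth, bounded, continuously differentiable, and zero-centered.
    Replace with the definition given in the full paper."""
    return 2.0 * logsig(x) - 1.0

x = np.linspace(-5.0, 5.0, 11)
for f in (logsig, tansig, new_sigmoid_placeholder):
    y = f(x)
    print(f"{f.__name__:>24}: range [{y.min():.3f}, {y.max():.3f}], f(0) = {f(0.0):.3f}")
```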
