Abstract

A number of attempts have been made in the past to improve the generalization performance of artificial neural networks (ANNs). Internal adjustment of an ANN involves tuning various parameters such as the learning rate and the activation function. The majority of activation functions in the literature are transcendental in nature. In this paper, a novel parametric algebraic activation (PAA) function is proposed. PAA is a generalized function of which the Elliott activation function is a special case. PAA defines a family of S-shaped curves and satisfies all the important properties of an activation function. The proposed activation is employed in the resilient propagation (RPROP) learning algorithm. Comparative performance evaluation against widely used activation functions, in terms of number of epochs and testing error, has been carried out on various benchmark datasets taken from the University of California Irvine (UCI) machine learning repository. The decrease in the number of epochs and in testing error for the proposed PAA in RPROP is highly statistically significant for most of the UCI datasets when compared with standard activation functions. Thus, incorporating PAA in RPROP can make it more powerful for classification.
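The abstract does not state the closed form of PAA, only that it is algebraic and reduces to the Elliott function as a special case. As a minimal illustrative sketch, assuming a hypothetical single slope parameter `a` (not taken from the paper) that recovers Elliott at `a = 1`:

```python
def elliott(x):
    """Elliott activation: an algebraic S-shaped (sigmoidal) function."""
    return x / (1.0 + abs(x))

def paa(x, a=1.0):
    """Hypothetical parametric algebraic activation (illustration only).

    The parameter `a` scales the input, giving a family of S-shaped
    curves; with a = 1 this reduces to the Elliott function, mirroring
    the 'special case' relationship described in the abstract.
    """
    return (a * x) / (1.0 + abs(a * x))

# The curve stays bounded in (-1, 1) and passes through the origin,
# as expected of a sigmoidal activation.
print(elliott(1.0))      # 0.5
print(paa(1.0, a=1.0))   # 0.5, identical to Elliott at a = 1
print(paa(1.0, a=4.0))   # 0.8, a steeper member of the family
```

Because such a function is a ratio of polynomials in `x` and `|x|`, it avoids the exponentials of transcendental activations like the logistic sigmoid or tanh, which is the practical appeal of algebraic activations.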
