Abstract

In this work, an adaptive mechanism for choosing the activation function is proposed and described. Four bi-modal derivative sigmoidal adaptive activation functions are used as activation functions at the hidden layer of single-hidden-layer sigmoidal feedforward artificial neural networks. These four bi-modal derivative activation functions are grouped into asymmetric and anti-symmetric activation functions (two in each group). For comparison, the logistic function (an asymmetric function) and the function obtained by subtracting 0.5 from it (an anti-symmetric function) are also used as activation functions for the hidden-layer nodes. The resilient backpropagation algorithm with improved weight-backtracking (iRprop+) is used to adapt the parameters of the activation functions as well as the weights and biases of the networks. The efficacy and efficiency of the proposed mechanism are demonstrated on 10 function approximation tasks and four real benchmark problems taken from the UCI Machine Learning Repository. The obtained results demonstrate that, for both asymmetric and anti-symmetric usage, the proposed adaptive activation functions are as good as, if not better than, the sigmoidal function without any adaptive parameter when used as the activation function of the hidden-layer nodes.
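The abstract does not give the functional forms, but a common way to obtain a sigmoidal function with a bi-modal derivative is to average two shifted logistic functions, with the shift as the adaptive parameter; the anti-symmetric counterpart is then formed by subtracting 0.5, as the abstract describes for the logistic baseline. The sketch below is an illustrative assumption of such a form (the function names, the averaging construction, and the parameter `a` are hypothetical, not taken from the paper), together with a numerical check that its derivative really has two modes:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bimodal_act(x, a):
    # Hypothetical asymmetric form: the average of two logistics shifted
    # by the adaptive parameter a. Its derivative is
    # 0.5 * (sigmoid'(x + a) + sigmoid'(x - a)), which has peaks near
    # x = -a and x = +a once a is large enough, i.e. it is bi-modal.
    return 0.5 * (sigmoid(x + a) + sigmoid(x - a))

def bimodal_act_antisym(x, a):
    # Anti-symmetric counterpart, obtained (as for the logistic baseline
    # in the abstract) by subtracting 0.5; it satisfies f(-x) = -f(x).
    return bimodal_act(x, a) - 0.5

# Numerically verify the bi-modal derivative for a fixed a.
x = np.linspace(-8.0, 8.0, 2001)
a = 3.0
dy = np.gradient(bimodal_act(x, a), x)
# Count strict local maxima of the derivative on the interior grid.
n_modes = int(np.sum((dy[1:-1] > dy[:-2]) & (dy[1:-1] > dy[2:])))
```

In a network, `a` would be treated as one more trainable parameter per hidden node and updated alongside the weights and biases by iRprop+, using the sign of the partial derivative of the error with respect to `a`.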
