Abstract

In this paper, a new activation function is proposed for use in constructing sigmoidal feedforward artificial neural networks. The suitability of the proposed activation function is established. The proposed activation function has a skewed derivative, whereas the derivatives of the commonly used activation functions (such as the log-sigmoid and the hyperbolic tangent) are symmetric about the y-axis. The efficiency and efficacy of the proposed activation function are demonstrated on six function approximation tasks. The results indicate that a network using the proposed activation function in the hidden layer converges to deeper minima of the error functional, generalizes better, and converges faster than a network using the standard log-sigmoid activation function in the hidden layer.
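As an illustrative aside, the symmetry property contrasted here can be checked numerically: the derivative of the log-sigmoid, σ'(x) = σ(x)(1 − σ(x)), and that of the hyperbolic tangent, 1 − tanh²(x), are both even functions. The sketch below (assuming NumPy) verifies this for the standard activations only; the paper's proposed skewed-derivative function is not specified in the abstract, so no attempt is made to reproduce it.

```python
import numpy as np

def logsigmoid(x):
    # Log-sigmoid: sigma(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def d_logsigmoid(x):
    # Derivative: sigma'(x) = sigma(x) * (1 - sigma(x)), an even function
    s = logsigmoid(x)
    return s * (1.0 - s)

def d_tanh(x):
    # Derivative: tanh'(x) = 1 - tanh(x)^2, also an even function
    return 1.0 - np.tanh(x) ** 2

x = np.linspace(-5.0, 5.0, 101)
# Symmetry about the y-axis means f'(-x) == f'(x) for all x
assert np.allclose(d_logsigmoid(-x), d_logsigmoid(x))
assert np.allclose(d_tanh(-x), d_tanh(x))
print("Derivatives of log-sigmoid and tanh are symmetric about the y-axis.")
```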
