Abstract

We present a new class of activation function for neural networks, called the CosGauss function: a cosine-modulated Gaussian. In contrast to the sigmoid, hyperbolic tangent, and Gaussian activation functions, the CosGauss function produces more ridges. We prove that this function can be used to approximate polynomials and step functions. The CosGauss function was tested with a cascade-correlation network on the sonar problem, and the results are compared with those obtained with other activation functions.
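The abstract does not give the exact parameterization of the function; a cosine-modulated Gaussian is commonly written as cos(b*x) * exp(-a*x^2), so the following minimal Python sketch assumes that form, with hypothetical shape parameters a and b, to illustrate how the cosine factor introduces extra ridges inside the Gaussian envelope:

    import numpy as np

    def cosgauss(x, a=1.0, b=1.0):
        # Assumed form: cosine-modulated Gaussian, cos(b*x) * exp(-a*x^2).
        # a and b are hypothetical shape parameters; the paper's exact
        # parameterization is not specified in the abstract.
        return np.cos(b * x) * np.exp(-a * x ** 2)

    # Increasing b adds more oscillations ("ridges") within the Gaussian
    # envelope, in contrast to the single bump of a plain Gaussian unit.
    x = np.linspace(-4.0, 4.0, 9)
    print(cosgauss(x, a=0.5, b=3.0))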
