Abstract

Activation functions play a key role in neural networks, as they strongly affect both the training process and the network’s final performance. Starting from the solution of an ordinary differential equation of the Riccati type, this work proposes an adaptive, generalized alternative to the fixed sigmoid, called the “generalized Riccati activation” (GRA). The proposed GRA function was employed in the output layer of an artificial neural network with a single hidden layer of eight neurons. The performance of the network was evaluated on a binary and a multiclass classification problem using different combinations of activation functions in the input/output layers. The results demonstrate that the swish/GRA combination yields higher accuracy than any other tested combination of activation functions. This accuracy benefit could be critical in domains such as healthcare and smart grids, where AI-assisted decision making is becoming essential.
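The abstract does not state the closed form of GRA, but the connection it names is well known: the logistic sigmoid is the bounded solution of the Riccati equation y' = y(1 - y), and any constant-coefficient Riccati equation y' = c(y - r1)(y - r2) with distinct real roots has a shifted-and-scaled sigmoid as its bounded solution. The sketch below illustrates one plausible adaptive parameterization along those lines; the choice of PyTorch, the parameter names (r1, r2, k, x0), and the decision to make all four learnable are illustrative assumptions, not the paper's definition.

```python
import torch
import torch.nn as nn

class GRA(nn.Module):
    """Sketch of a generalized Riccati activation.

    A constant-coefficient Riccati ODE y' = c*(y - r1)*(y - r2) with
    distinct real roots r1 < r2 has the bounded solution
        y(x) = r1 + (r2 - r1) * sigmoid(k * (x - x0)),  k = -c*(r2 - r1),
    which reduces to the standard sigmoid for r1 = 0, r2 = 1, c = -1, x0 = 0.
    Making the four parameters learnable is an assumption made here for
    illustration; the paper's exact parameterization may differ.
    """

    def __init__(self):
        super().__init__()
        # Initialize at the plain logistic sigmoid.
        self.r1 = nn.Parameter(torch.tensor(0.0))  # lower asymptote
        self.r2 = nn.Parameter(torch.tensor(1.0))  # upper asymptote
        self.k = nn.Parameter(torch.tensor(1.0))   # slope
        self.x0 = nn.Parameter(torch.tensor(0.0))  # horizontal shift

    def forward(self, x):
        return self.r1 + (self.r2 - self.r1) * torch.sigmoid(self.k * (x - self.x0))
```

At initialization (r1 = 0, r2 = 1, k = 1, x0 = 0) the module reproduces the fixed sigmoid exactly, so training only moves the activation away from the sigmoid if doing so reduces the loss.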

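For concreteness, a hypothetical reconstruction of the evaluated architecture (a single hidden layer of eight neurons with swish, GRA on the output, wired here for the binary task) might look as follows; everything beyond the stated layer size and activation choices is an illustrative assumption.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwishGRANet(nn.Module):
    """Hypothetical reconstruction of the evaluated network: one hidden
    layer of eight neurons with swish, GRA on the output (binary head).
    Assumes the GRA module sketched above is in scope."""

    def __init__(self, n_features: int):
        super().__init__()
        self.hidden = nn.Linear(n_features, 8)  # single hidden layer, 8 neurons
        self.out = nn.Linear(8, 1)              # one output for the binary task
        self.gra = GRA()                        # adaptive output activation

    def forward(self, x):
        h = F.silu(self.hidden(x))  # swish (SiLU) in the hidden layer
        return self.gra(self.out(h))
```

For the multiclass problem the output width would match the number of classes; the abstract does not say how GRA is combined with the loss in that setting, so that detail is left open here.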