Abstract

The performance of feed-forward neural networks can be substantially impaired by ill-conditioning of the corresponding Jacobian matrix. The ill-conditioning that arises during the feed-forward learning process is related to the properties of the activation function used. It is shown that training performance can be improved by using an adaptive activation function whose gain parameter is suitably updated during the learning process. The efficiency of the proposed adaptive procedure is examined in structural optimization problems, where a trained neural network replaces the structural analysis phase and provides the necessary data to the optimizer. The optimizer used in this study is an algorithm based on evolution strategies.

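As an illustration of the idea, the sketch below shows one common way an adaptive gain can enter a sigmoid activation, f(x) = 1/(1 + exp(-λx)), with the gain λ updated by gradient descent alongside the weights. The toy network, data, and update rule are assumptions for demonstration only, not the specific adaptive procedure proposed in the paper.

```python
# Minimal sketch (not the authors' implementation): a sigmoid activation with an
# adaptive gain parameter, f(x) = 1 / (1 + exp(-gain * x)), where the gain is
# trained by gradient descent together with the weights of a toy one-layer model.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x, gain):
    # Sigmoid with adjustable slope (gain) parameter
    return 1.0 / (1.0 + np.exp(-gain * x))

# Synthetic data for demonstration (assumption, not from the paper)
X = rng.normal(size=(64, 3))
t = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

w = rng.normal(scale=0.1, size=3)   # weights of the single-layer model
gain = 1.0                          # adaptive gain, learned with the weights
lr = 0.5                            # learning rate (assumed)

for epoch in range(200):
    z = X @ w
    y = sigmoid(z, gain)
    err = y - t                     # error on a mean-squared-error loss
    dy = y * (1.0 - y)              # sigmoid derivative w.r.t. its argument gain*z
    # Gradients of the MSE loss w.r.t. the weights and the gain parameter
    grad_w = X.T @ (err * dy * gain) / len(t)
    grad_gain = np.sum(err * dy * z) / len(t)
    w -= lr * grad_w
    gain -= lr * grad_gain          # the gain adapts during learning

mse = np.mean((sigmoid(X @ w, gain) - t) ** 2)
print(f"final gain: {gain:.3f}, training MSE: {mse:.4f}")
```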