Abstract

In this study, a new procedure for determining the optimum activation function of a neural network is proposed. Unlike previous methods of optimising activation functions, the proposed approach treats the selection of the most suitable activation function as a discrete optimisation problem: various combinations of candidate functions are generated, their performance as activation functions in a neural network is evaluated, and the function or combination of functions yielding the best result is returned as the optimum. The efficacy of the proposed optimisation method is compared with that of conventional approaches using data generated from several synthetic functions. Numerical results indicate that the network produced by the proposed method achieves better accuracy with a smaller network size than the other approaches. The bridge scour problem is used to further demonstrate the performance of the proposed algorithm. Based on the training and validation results, the neural network developed with the proposed optimisation method produces better estimates of both equilibrium and time-dependent scour depth than networks with a priori chosen activation functions. Furthermore, the predictions of the proposed model are compared with those of empirical methods, with the former proving more accurate.
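The discrete search described above can be illustrated with a minimal sketch. The candidate set, the use of pairwise composition to form combinations, and the tiny random-features training scheme are all assumptions chosen for brevity; the paper's actual search space and training procedure are not specified in the abstract.

```python
import itertools
import numpy as np

# Assumed candidate primitives; the paper's actual set may differ.
PRIMITIVES = {
    "tanh": np.tanh,
    "relu": lambda x: np.maximum(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
}

def combine(f, g):
    """Compose two primitives into a candidate activation f(g(x))."""
    return lambda x: f(g(x))

def candidate_activations():
    """Enumerate single primitives plus all pairwise compositions."""
    funcs = dict(PRIMITIVES)
    for (na, fa), (nb, fb) in itertools.product(PRIMITIVES.items(), repeat=2):
        funcs[f"{na}({nb})"] = combine(fa, fb)
    return funcs

def evaluate(act, X, y, hidden=8, epochs=200, lr=0.1, seed=0):
    """Score one candidate: train a one-hidden-layer net using it
    as the activation and return the mean-squared error.
    Only the output weights are trained, random-features style,
    to keep the sketch short."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        H = act(X @ W1)           # hidden layer with candidate activation
        err = H @ W2 - y
        W2 -= lr * H.T @ err / len(X)
    return float(np.mean((act(X @ W1) @ W2 - y) ** 2))

def select_activation(X, y):
    """Discrete optimisation step: evaluate every candidate and
    return the name of the one with the lowest error."""
    scores = {name: evaluate(f, X, y) for name, f in candidate_activations().items()}
    return min(scores, key=scores.get)
```

Each candidate is scored by actually training a small network with it, and the best-scoring function (or composition) is selected, mirroring the generate-evaluate-return loop the abstract describes.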
