Abstract

Searching for the optimal parameters of a neural network can be formulated as a multi-modal optimization problem. This paper proposes a novel water wave optimization (WWO)-based memetic algorithm to identify the optimal weights of neural networks. In the proposed water wave optimization-based memetic algorithm (WWOMA), WWO performs a global search through both individual improvement and population co-evolution, and several local search components are then employed to enhance its local refinement ability. Moreover, an effective Meta-Lamarckian learning strategy is used to choose a suitable local search component, concentrating computational effort on the more promising solutions. We carry out simulation experiments on six well-known neural network design benchmark problems; both the simulation results and the statistical comparisons demonstrate the feasibility, effectiveness, and efficiency of applying WWOMA to the design of neural networks. Furthermore, we apply WWOMA to design neural networks and use the trained networks to predict the tensile strength of micro-alloyed steels. Evaluation on a practical industrial case with 2489 samples shows that, compared with other algorithms, WWOMA-based neural networks achieve notable and robust prediction accuracy, which further demonstrates that WWOMA is a promising and efficient algorithm for designing neural networks. To the best of our knowledge, this is the first report of applying water wave optimization to train neural networks.
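As a rough illustration of the approach described above, the sketch below combines a simplified WWO-style global search (propagation and refraction over a population of candidate weight vectors) with Meta-Lamarckian selection between two local-search components, applied to a toy one-hidden-layer network. The operators, hyperparameters (`H_MAX`, `ALPHA`), the two local-search routines, and the reward rule are illustrative assumptions, not the authors' exact implementation.

```python
# A minimal sketch (not the paper's implementation) of a WWO-based memetic loop
# for training a tiny one-hidden-layer network; hyperparameters and the two
# local-search components are assumptions chosen for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: map 1 input to 1 output.
X = rng.uniform(-1.0, 1.0, size=(64, 1))
y = np.sin(3.0 * X)

N_HIDDEN = 5
DIM = 1 * N_HIDDEN + N_HIDDEN + N_HIDDEN * 1 + 1   # weights + biases

def mse(w):
    """Decode the flat weight vector and return the network's mean squared error."""
    w1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
    b1 = w[N_HIDDEN:2 * N_HIDDEN]
    w2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
    b2 = w[-1]
    hidden = np.tanh(X @ w1 + b1)
    return float(np.mean((hidden @ w2 + b2 - y) ** 2))

# Two simple local-search components (assumed, for illustration).
def hill_climb(w, f, evals=20, step=0.05):
    for _ in range(evals):
        cand = w + rng.normal(0.0, step, DIM)
        fc = mse(cand)
        if fc < f:
            w, f = cand, fc
    return w, f

def coordinate_search(w, f, evals=20, step=0.1):
    for _ in range(evals):
        cand = w.copy()
        cand[rng.integers(DIM)] += rng.choice([-step, step])
        fc = mse(cand)
        if fc < f:
            w, f = cand, fc
    return w, f

LOCAL_SEARCHES = [hill_climb, coordinate_search]
reward = np.ones(len(LOCAL_SEARCHES))   # Meta-Lamarckian credit per component

# Water-wave population: each wave is a candidate weight vector.
POP, H_MAX, ALPHA = 10, 6, 1.01
waves = rng.uniform(-1.0, 1.0, size=(POP, DIM))
fit = np.array([mse(w) for w in waves])
heights = np.full(POP, H_MAX)
lengths = np.full(POP, 0.5)
best_idx = int(np.argmin(fit))
best_w, best_f = waves[best_idx].copy(), fit[best_idx]

for gen in range(200):
    # Wavelength update: better waves search in smaller neighbourhoods.
    eps = 1e-12
    lengths *= ALPHA ** (-(fit - fit.min() + eps) / (fit.max() - fit.min() + eps))
    for i in range(POP):
        # Propagation: move the wave within its current wavelength.
        cand = waves[i] + rng.uniform(-1.0, 1.0, DIM) * lengths[i]
        fc = mse(cand)
        if fc < fit[i]:
            waves[i], fit[i] = cand, fc
            heights[i] = H_MAX
            if fc < best_f:
                best_w, best_f = cand.copy(), fc
        else:
            heights[i] -= 1
            if heights[i] == 0:   # Refraction: restart between this wave and the best.
                waves[i] = 0.5 * (best_w + waves[i]) + rng.normal(0.0, 0.1, DIM)
                fit[i] = mse(waves[i])
                heights[i] = H_MAX

    # Meta-Lamarckian step: pick a local-search component in proportion to its
    # past reward, refine the current best wave, and update the reward.
    k = rng.choice(len(LOCAL_SEARCHES), p=reward / reward.sum())
    new_w, new_f = LOCAL_SEARCHES[k](best_w, best_f)
    reward[k] = 0.9 * reward[k] + max(best_f - new_f, 0.0)
    best_w, best_f = new_w, new_f

print(f"final training MSE: {best_f:.5f}")
```

In this sketch the reward of each local-search component is an exponentially decayed record of the fitness improvement it produced, so the roulette-wheel selection gradually concentrates effort on whichever component has recently been most effective, which is the essence of a Meta-Lamarckian learning strategy.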
