Abstract

Searching for the optimal parameters of a neural network (NN) can be seen as a multi-modal optimization problem. This paper proposes a memetic algorithm based on water wave optimization (WWOMA) to determine the optimal weights of an NN. In the proposed WWOMA, WWO performs a global search through both individual improvement and population co-evolution, and several local search components then enhance its local refinement ability. Furthermore, a Meta-Lamarckian learning strategy is used to choose an appropriate local refinement component, so that computational effort is concentrated on more promising solutions. We carry out numerical experiments on six well-known NN design benchmark problems; the empirical results demonstrate both the feasibility and effectiveness of applying WWOMA to NN design.
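The core idea of Meta-Lamarckian learning is to select among several local search operators adaptively, favoring those that have historically yielded the largest fitness improvements. The sketch below illustrates one common realization, reward-proportional (roulette-wheel) operator selection; the operator names and the reward bookkeeping are illustrative assumptions, not details taken from the paper.

```python
import random

def meta_lamarckian_select(operators, rewards):
    """Pick a local-search operator with probability proportional to its
    accumulated reward (roulette-wheel selection).

    operators: list of operator names (illustrative, not from the paper)
    rewards:   dict mapping each operator name to its accumulated
               fitness improvement so far
    """
    total = sum(rewards[op] for op in operators)
    if total <= 0:
        # No history yet: fall back to a uniform random choice.
        return random.choice(operators)
    r = random.uniform(0, total)
    acc = 0.0
    for op in operators:
        acc += rewards[op]
        if acc >= r:
            return op
    return operators[-1]  # guard against floating-point round-off
```

In a memetic loop, after an operator refines a candidate solution, its entry in `rewards` would be increased by the fitness improvement it produced, so stronger operators are chosen more often over time.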
