Abstract

This paper presents a two-stage approach, denoted CBRDE-LM, for evolving the architecture and weights of feedforward artificial neural networks (ANNs). In the first stage, a collaborative binary-real differential evolution (CBRDE) simultaneously optimizes the network architecture and the connection weights of an ANN through a specific individual representation and evolutionary scheme, in which the structure is represented indirectly by a binary encoding and the connection weights are encoded directly as real values. In the second stage, starting from the resulting architecture and weights, the Levenberg-Marquardt (LM) backpropagation algorithm is applied to fine-tune the ANN weights. The performance of the two-stage approach has been evaluated on several benchmarks. The results demonstrate that it can quickly produce compact ANNs with good generalization ability at low computational cost.
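The first stage can be sketched as a differential evolution whose individuals pair a real-valued weight vector with a binary connectivity mask. The sketch below is illustrative only: the network sizes, the toy XOR task, the connection-count penalty, and the probabilistic bit-flip rule for the binary part are assumptions for this example, not the paper's actual CBRDE scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 feedforward net (sizes chosen for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)
N_W = 2 * 4 + 4 + 4 * 1 + 1  # weights + biases = 17

def forward(w, m, x):
    """Evaluate the network; the binary mask m gates each weight."""
    wm = w * m
    W1, b1 = wm[:8].reshape(2, 4), wm[8:12]
    W2, b2 = wm[12:16].reshape(4, 1), wm[16]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(w, m):
    """MSE plus a small penalty on active connections, favouring compact nets."""
    pred = forward(w, m, X).ravel()
    return np.mean((pred - y) ** 2) + 1e-3 * m.sum()

POP, F, CR = 30, 0.7, 0.9
W = rng.normal(0.0, 1.0, (POP, N_W))                  # real-coded weights
M = (rng.random((POP, N_W)) < 0.8).astype(float)      # binary-coded structure

init_best = min(fitness(W[i], M[i]) for i in range(POP))

for gen in range(300):
    for i in range(POP):
        a, b, c = rng.choice([j for j in range(POP) if j != i], 3, replace=False)
        cross = rng.random(N_W) < CR
        cross[rng.integers(N_W)] = True               # ensure at least one gene crosses
        # Real part: classic DE/rand/1 mutation + binomial crossover
        trial_w = np.where(cross, W[a] + F * (W[b] - W[c]), W[i])
        # Binary part (illustrative rule): treat the DE mutant magnitude as a
        # probability of setting each bit to 1
        p_one = np.clip(np.abs(M[a] + F * (M[b] - M[c])), 0.0, 1.0)
        trial_m = np.where(cross, (rng.random(N_W) < p_one).astype(float), M[i])
        # Greedy one-to-one selection, as in standard DE
        if fitness(trial_w, trial_m) <= fitness(W[i], M[i]):
            W[i], M[i] = trial_w, trial_m

best = min(range(POP), key=lambda i: fitness(W[i], M[i]))
final_best = fitness(W[best], M[best])
pred = forward(W[best], M[best], X).ravel()
print("best penalized fitness:", final_best, "active connections:", int(M[best].sum()))
```

For the second stage, the surviving weights (those with mask bits set) would be handed to an LM optimizer; outside dedicated neural-network toolboxes, `scipy.optimize.least_squares` with `method='lm'` is one readily available Levenberg-Marquardt implementation that could play this role.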
