Abstract

The performance of an artificial neural network (ANN) depends on the proper selection of connection weights, network architecture, and cost function during training. This paper presents a hybrid approach, GaTSa, to optimize the performance of an ANN in terms of both architecture and weights. GaTSa extends a previous method by the authors (TSa): it integrates the heuristics simulated annealing (SA), tabu search (TS), and genetic algorithms (GA) with backpropagation, whereas TSa does not use GA. The main advantages of GaTSa are a constructive process that adds new nodes to the architecture based on GA, the ability to escape from local minima through uphill moves (an SA feature), and faster convergence through the evaluation of a set of candidate solutions at each iteration (a TS feature). The performance of GaTSa is investigated through an empirical evaluation on 11 public-domain data sets, using different cost functions in the simultaneous optimization of multilayer perceptron architecture and weights. The experiments demonstrated that GaTSa can also be used to select relevant features, and that it achieves statistically significant results in comparison with other global and local optimization techniques.
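To make the division of labor among the three heuristics concrete, the sketch below shows how one GaTSa-style iteration might combine them: a set of neighbors is evaluated per step (TS feature), uphill moves are accepted with a Boltzmann probability (SA feature), and the architecture is grown constructively on a fixed schedule (GA-inspired step). Every name, the placeholder cost function, the growth schedule, and the cooling rate are illustrative assumptions, not the authors' implementation, and the final backpropagation fine-tuning stage described in the paper is omitted.

```python
import math
import random

def cost(solution):
    """Placeholder cost (assumption): in the paper this would combine
    network error with the chosen cost function and architecture size."""
    return sum(w * w for w in solution)

def neighbors(solution, k=10, step=0.1):
    """TS feature: generate a *set* of candidate solutions per iteration
    instead of a single perturbation."""
    return [[w + random.uniform(-step, step) for w in solution]
            for _ in range(k)]

def add_node(solution):
    """GA-inspired constructive step (assumption): grow the architecture
    by appending randomly initialized weights for a new hidden node."""
    return solution + [random.uniform(-0.5, 0.5)]

def gatsa_sketch(initial, temp=1.0, cooling=0.95, iters=200, grow_every=50):
    current, best = list(initial), list(initial)
    for i in range(1, iters + 1):
        # Evaluate the whole candidate set and keep the best (TS feature).
        candidate = min(neighbors(current), key=cost)
        delta = cost(candidate) - cost(current)
        # SA feature: accept uphill moves with Boltzmann probability,
        # which lets the search escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
        if cost(current) < cost(best):
            best = list(current)
        # Constructive growth on a fixed schedule (assumption).
        if i % grow_every == 0:
            current = add_node(current)
        temp *= cooling  # geometric cooling schedule (assumption)
    return best

print(gatsa_sketch([random.uniform(-1, 1) for _ in range(5)]))
```

In the actual method the solution encoding covers both weights and topology, so the constructive step corresponds to enlarging the network rather than merely lengthening a weight vector; the sketch only illustrates how the SA acceptance rule, the TS-style candidate set, and the GA-style growth interleave in a single loop.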
