Abstract

Geometric Semantic Genetic Programming (GSGP) is a recently proposed form of Genetic Programming in which the fitness landscape seen by its variation operators is, by construction, unimodal with a linear slope and, consequently, easy to search. This property holds across all supervised learning problems. In this paper we propose a feedforward Neural Network construction algorithm derived from GSGP. This algorithm shares the same fitness landscape as GSGP, which allows an efficient search to be performed over the space of feedforward Neural Networks without the need for backpropagation. Experiments are conducted on real-life multidimensional symbolic regression datasets, and the results show that the proposed algorithm surpasses GSGP, with statistical significance, in terms of learning the training data. In terms of generalization, the results are similar to those of GSGP.
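To illustrate why the landscape seen by GSGP's operators is easy to search, the following is a minimal sketch of geometric semantic mutation viewed purely at the semantic level (vectors of outputs on the training cases). The fixed mutation step `ms`, the normally distributed perturbation semantics, and the simple hill-climbing loop are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rmse(semantics, target):
    """Fitness: root mean squared error between outputs and targets,
    i.e. a distance to the target point in semantic space."""
    return np.sqrt(np.mean((semantics - target) ** 2))

def geometric_semantic_mutation(parent_semantics, ms=0.1):
    """Offspring semantics = parent + ms * (r1 - r2): a bounded
    perturbation in semantic space. Because fitness is a distance
    to a fixed target in this space, the induced landscape is
    unimodal (assumed perturbation distribution: standard normal)."""
    r1 = rng.standard_normal(parent_semantics.shape)
    r2 = rng.standard_normal(parent_semantics.shape)
    return parent_semantics + ms * (r1 - r2)

# Toy run: semantic hill climbing toward a random target.
target = rng.standard_normal(50)
current = rng.standard_normal(50)
for _ in range(1000):
    child = geometric_semantic_mutation(current)
    if rmse(child, target) < rmse(current, target):
        current = child
print(f"final RMSE: {rmse(current, target):.4f}")
```

In the paper's setting, the same kind of semantic step is realized by structural changes to a model (a GP tree in GSGP, or a hidden neuron added to a feedforward network in the proposed algorithm), which is why no backpropagation is required.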
