Abstract

In neural networks, simultaneously finding optimal values for the number of hidden neurons and the connection weights is a challenging task. This is because changing the number of hidden neurons alters the entire structure of the network and complicates the training process: the number of decision variables grows in proportion to the number of hidden nodes. As one of the seminal attempts to address these challenges, a hybrid encoding scheme is first proposed. A set of recent, well-regarded stochastic population-based algorithms is then employed to optimize both the number of hidden neurons and the connection weights of a single-hidden-layer feedforward neural network (FFNN). In the experiments, twenty-three standard classification datasets are used to benchmark the proposed technique qualitatively and quantitatively. The results show that the hybrid encoding scheme allows the optimization algorithms to conveniently find optimal values for both the number of hidden nodes and the connection weights, and that the recently proposed grey wolf optimizer (GWO) outperforms the other algorithms.
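The abstract does not spell out the encoding itself, but one common way to realize such a hybrid scheme is a fixed-length real-valued vector whose first gene encodes the number of hidden neurons and whose remaining genes hold weights for the largest allowed network, with the unused tail ignored after decoding. The sketch below illustrates this idea under assumed sizes (MAX_HIDDEN, N_INPUTS, N_OUTPUTS are hypothetical, not taken from the paper):

```python
import math
import random

# Assumed problem sizes (hypothetical, for illustration only)
MAX_HIDDEN = 10   # upper bound on hidden neurons
N_INPUTS = 4
N_OUTPUTS = 1

def decode(vector):
    """Split a hybrid candidate solution into (n_hidden, active weights).

    The first gene encodes the number of hidden neurons; the rest are
    connection weights, of which only the first
    n_hidden * (N_INPUTS + 1) + N_OUTPUTS * (n_hidden + 1) are active
    (the +1 terms account for bias weights).
    """
    n_hidden = max(1, min(MAX_HIDDEN, round(vector[0])))
    n_active = n_hidden * (N_INPUTS + 1) + N_OUTPUTS * (n_hidden + 1)
    return n_hidden, vector[1:1 + n_active]

def forward(vector, x):
    """Evaluate the decoded single-hidden-layer FFNN on one sample x."""
    n_hidden, w = decode(vector)
    idx = 0
    hidden = []
    for _ in range(n_hidden):
        s = w[idx]; idx += 1          # bias weight
        for xi in x:
            s += w[idx] * xi; idx += 1
        hidden.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid activation
    outputs = []
    for _ in range(N_OUTPUTS):
        s = w[idx]; idx += 1          # bias weight
        for h in hidden:
            s += w[idx] * h; idx += 1
        outputs.append(1.0 / (1.0 + math.exp(-s)))
    return outputs

# Every candidate has the same length, so any population-based optimizer
# (e.g. GWO) can manipulate it without structural special cases.
GENOME_LEN = 1 + MAX_HIDDEN * (N_INPUTS + 1) + N_OUTPUTS * (MAX_HIDDEN + 1)
random.seed(0)
candidate = ([random.uniform(1, MAX_HIDDEN)]
             + [random.gauss(0, 1) for _ in range(GENOME_LEN - 1)])
y = forward(candidate, [0.1, 0.2, 0.3, 0.4])
```

Because the genome length is fixed, standard variation and update operators apply unchanged; only the decoding step decides how many of the weight genes are actually used.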

