Abstract

Automatically searching for an optimal neural network (NN) with optimisation algorithms is a significant research topic in deep learning and artificial intelligence. However, it remains challenging for two reasons: both the hyperparameters and the architecture must be optimised, and the optimisation process is computationally expensive. To tackle these two issues, this paper focuses on solving the joint hyperparameter and architecture optimisation problem for NNs and proposes a novel light-weight scale-adaptive fitness evaluation-based particle swarm optimisation (SAFE-PSO) approach. Firstly, the SAFE-PSO algorithm optimises the hyperparameters and the architecture together and can therefore find their optimal combination for the globally best NN. Secondly, the computational cost is reduced by using multi-scale accuracy evaluation methods to evaluate candidates. Thirdly, a stagnation-based switch strategy is proposed to adaptively switch between evaluation methods to better balance search performance and computational cost. The SAFE-PSO algorithm is tested on two widely used datasets: the 10-category CIFAR10 and the 100-category CIFAR100. The experimental results show that SAFE-PSO is effective and efficient: it not only finds a promising NN automatically but also finds a better NN than the compared algorithms at the same computational cost.
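
To make the mechanism concrete, the Python sketch below illustrates the general idea of a PSO loop whose fitness evaluation switches to a finer (more expensive) scale when the global best stagnates. It is not the paper's implementation: the encoding length, the number of evaluation scales, the stagnation limit, and the toy fitness function are all assumptions made for illustration; in the actual algorithm each evaluation would train and score a candidate NN under the chosen accuracy-evaluation method.

import random

DIM = 8                 # length of the combined hyperparameter+architecture encoding (assumed)
SWARM_SIZE = 10
STAGNATION_LIMIT = 5    # generations without global-best improvement before switching (assumed)

def evaluate(position, scale):
    """Stand-in for multi-scale accuracy evaluation: cheaper scales would train
    or evaluate the candidate NN on less data or for fewer epochs. Here a toy
    quality function gets a noisier estimate at coarser scales."""
    true_quality = -sum((x - 0.5) ** 2 for x in position)
    noise = random.gauss(0.0, [0.3, 0.1, 0.0][scale])  # coarser scale -> noisier
    return true_quality + noise

def safe_pso_sketch(generations=50):
    positions = [[random.random() for _ in range(DIM)] for _ in range(SWARM_SIZE)]
    velocities = [[0.0] * DIM for _ in range(SWARM_SIZE)]
    scale = 0  # start with the cheapest evaluation scale
    pbest = [p[:] for p in positions]
    pbest_fit = [evaluate(p, scale) for p in positions]
    g = max(range(SWARM_SIZE), key=lambda i: pbest_fit[i])
    gbest_pos, gbest_fit = pbest[g][:], pbest_fit[g]
    stagnation = 0

    for _ in range(generations):
        improved = False
        for i in range(SWARM_SIZE):
            # Standard PSO velocity/position update (inertia 0.7, c1 = c2 = 1.5).
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                velocities[i][d] = (0.7 * velocities[i][d]
                                    + 1.5 * r1 * (pbest[i][d] - positions[i][d])
                                    + 1.5 * r2 * (gbest_pos[d] - positions[i][d]))
                positions[i][d] = min(1.0, max(0.0, positions[i][d] + velocities[i][d]))
            fit = evaluate(positions[i], scale)
            if fit > pbest_fit[i]:
                pbest[i], pbest_fit[i] = positions[i][:], fit
                if fit > gbest_fit:
                    gbest_pos, gbest_fit = positions[i][:], fit
                    improved = True
        # Stagnation-based switch: when the global best stops improving, move
        # to a more accurate (more expensive) evaluation scale.
        if improved:
            stagnation = 0
        else:
            stagnation += 1
            if stagnation >= STAGNATION_LIMIT and scale < 2:
                scale += 1
                stagnation = 0
                # Re-score the global best under the new scale (personal bests
                # are left as-is here, a simplification for the sketch).
                gbest_fit = evaluate(gbest_pos, scale)
    return gbest_pos, gbest_fit, scale

if __name__ == "__main__":
    best_pos, best_fit, final_scale = safe_pso_sketch()
    print(f"best fitness {best_fit:.4f} at evaluation scale {final_scale}")

In this sketch the search spends its early generations on cheap, noisy evaluations and only pays for high-fidelity evaluation once progress stalls, which is the cost/performance trade-off the abstract describes.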
