Abstract

Neural architecture search (NAS) can help researchers design high-quality neural network structures, but it is very time-consuming: a reinforcement-learning-based NAS method, for example, requires more than 3,000 GPU hours to find a good architecture on CIFAR-10. Gradient-based methods, in order to use back-propagation during training, must relax the architecture into a continuous space. We therefore propose a neural architecture search algorithm based on Particle Swarm Optimization (PSO), which we call PNAS. First, we train a super-net: by randomly sampling during super-net training so that only one path is activated at a time, we greatly reduce the coupling between super-net nodes. After the super-net is trained, we use the PSO algorithm to search for the optimal architecture. Our PSO-based neural architecture search achieves competitive speed compared to state-of-the-art models: PNAS searches 28% faster than GDAS, and the resulting architecture also has fewer parameters than GDAS.
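The second stage described above uses standard PSO velocity and position updates to explore the architecture space. Below is a minimal sketch of that search loop, assuming architectures are encoded as real-valued vectors in [0, 1]; the `fitness` function here is a hypothetical stand-in with a known optimum, whereas the paper's actual fitness would evaluate a sampled sub-network of the trained super-net on validation data.

```python
import numpy as np

# Hypothetical stand-in for "validation accuracy of the decoded
# architecture": a toy objective with a known maximizer, so the
# search loop itself can be exercised end to end.
TARGET = np.array([0.2, 0.8, 0.5, 0.9])

def fitness(x):
    # Higher is better (analogue of validation accuracy).
    return -np.sum((x - TARGET) ** 2)

def pso_search(dim=4, n_particles=20, iters=100,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, (n_particles, dim))  # positions (encoded archs)
    v = np.zeros((n_particles, dim))               # velocities
    pbest = x.copy()                               # personal bests
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_f)].copy()       # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Canonical PSO update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)               # keep encodings valid
        f = np.array([fitness(p) for p in x])
        improved = f > pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[np.argmax(pbest_f)].copy()
    return gbest, fitness(gbest)

best, best_f = pso_search()
```

In a real NAS setting, each position vector would be decoded into a discrete choice of operations and connections before evaluation; the update rule itself is unchanged.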

