Abstract

Neural Architecture Search (NAS) automates the design of neural network architectures for a given task. Although NAS removes the need to hand-design architectures, existing NAS algorithms are immensely time-consuming, and their main bottleneck is the training time required for each candidate architecture. This study proposes Improved Grey Wolf Optimization based on Synaptic Saliency (IGWO-SS), which is much faster than existing NAS algorithms and achieves better final performance. IGWO-SS skips training the less promising architectures by ranking candidates according to their synaptic saliency: architectures that rank lower are considered less promising than those that rank higher. Because synaptic saliency can be computed very quickly, skipping the training of less promising architectures saves a significant amount of time. We performed extensive experiments to evaluate the efficacy of synaptic saliency for improving NAS. Our experimental results suggest that the synaptic saliency of an untrained neural network correlates positively with its final accuracy, so it can be used to identify promising networks before training them. The results further show that IGWO-SS is almost 10x faster and achieves better final performance than five other bio-inspired algorithms, and that it attains higher mean accuracy than state-of-the-art NAS algorithms, including REA, RS, RL, BOHB, DARTSV1, DARTSV2, GDAS, SETN, and ENAS. We hope that our work will make NAS more accessible and useful to researchers by reducing the time and resources required to perform NAS.
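The abstract does not spell out the scoring procedure, but synaptic saliency is typically computed in the data-free style of SynFlow (Tanaka et al., 2020): one forward/backward pass of an all-ones input through a linearized (absolute-valued) copy of the untrained network, summing |dR/dθ ⊙ θ| over all parameters. The sketch below is our own minimal PyTorch illustration of that idea, assuming such a formulation; the function names (synaptic_saliency_score, _linearize, _restore) and the toy candidate pool are hypothetical and are not the paper's implementation.

```python
import torch
import torch.nn as nn


@torch.no_grad()
def _linearize(model: nn.Module) -> dict:
    """Replace every parameter by its absolute value, returning the
    original signs so the model can be restored afterwards."""
    signs = {}
    for name, tensor in model.state_dict().items():
        signs[name] = torch.sign(tensor)
        tensor.abs_()
    return signs


@torch.no_grad()
def _restore(model: nn.Module, signs: dict) -> None:
    """Undo _linearize by multiplying the original signs back in."""
    for name, tensor in model.state_dict().items():
        tensor.mul_(signs[name])


def synaptic_saliency_score(model: nn.Module, input_shape) -> float:
    """Score an *untrained* network with one forward/backward pass on
    an all-ones input: sum over parameters of |dR/dtheta * theta|.
    No training data or labels are needed, so scoring is very fast."""
    model = model.double()            # double precision avoids overflow
    signs = _linearize(model)         # data-free, positive objective
    model.zero_grad()
    ones = torch.ones((1, *input_shape), dtype=torch.float64)
    R = model(ones).sum()             # scalar saliency objective
    R.backward()
    score = sum(
        (p.grad * p).abs().sum().item()
        for p in model.parameters()
        if p.grad is not None
    )
    _restore(model, signs)
    return score


# Usage sketch: rank candidate architectures without training any of them;
# the highest-saliency candidates are treated as the most promising.
candidates = [
    nn.Sequential(nn.Flatten(), nn.Linear(784, w), nn.ReLU(), nn.Linear(w, 10))
    for w in (32, 128, 512)
]
ranked = sorted(
    candidates,
    key=lambda m: synaptic_saliency_score(m, (1, 28, 28)),
    reverse=True,
)
```

In a search loop of this kind, only the top-ranked candidates would proceed to full training, which is where the reported time savings come from.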
