Abstract
Convolutional neural networks (CNNs) are a dominant computing paradigm in deep learning, and their architectures are widely regarded as the key to performance breakthroughs across a range of tasks. Recently, neural architecture search (NAS) methods have been proposed to automate network architecture design, and many have discovered novel CNN architectures that surpass human-designed ones. However, most current NAS methods suffer either from the prohibitively high computational complexity of the resulting architectures, which hinders deployment of the deep models, or from constraints that limit the flexibility of architecture design. To address these deficiencies, this work proposes an evolutionary computation (EC) based method for compact and flexible NAS. A search space that uses parameter-efficient mobile inverted bottleneck convolution blocks as its primitive components is proposed to ensure the initial quality of the compact architectures. In addition, a two-level variable-length particle swarm optimization (PSO) approach is devised to evolve both the micro-architecture and the macro-architecture of CNNs. Furthermore, this study proposes an effective scheme that integrates multiple computation-reduction techniques to greatly speed up the evaluation process. Experimental results on the CIFAR-10, CIFAR-100, and ImageNet datasets show the superiority of the proposed method over state-of-the-art algorithms in terms of classification performance, search cost, and resulting architecture complexity.
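For readers unfamiliar with the underlying optimizer, the sketch below shows the canonical, fixed-length, single-level PSO update (velocity and position rules) that the proposed two-level variable-length variant builds upon. It is a minimal illustration on a toy continuous objective, not the paper's actual encoding or search procedure; the function name pso_minimize, its parameters, and the toy sphere objective are assumptions introduced here for illustration only.

```python
import numpy as np

def pso_minimize(objective, dim, num_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Canonical fixed-length PSO; illustrative only, not the paper's variant."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(num_particles, dim))   # particle positions
    v = np.zeros((num_particles, dim))                    # particle velocities
    pbest = x.copy()                                      # personal best positions
    pbest_val = np.apply_along_axis(objective, 1, x)      # personal best fitness
    gbest = pbest[np.argmin(pbest_val)].copy()            # global best position

    for _ in range(iters):
        r1 = rng.random((num_particles, dim))
        r2 = rng.random((num_particles, dim))
        # Standard PSO velocity and position update rules.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Toy usage: minimize the sphere function in 3 dimensions.
best_x, best_val = pso_minimize(lambda z: float(np.sum(z ** 2)), dim=3)
print(best_x, best_val)
```

In the NAS setting described by the abstract, each particle would instead encode a candidate architecture (whose length can vary with network depth), and the objective would be the evaluated performance of that architecture rather than a closed-form function.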