With the growing deployment of deep neural networks on platforms such as mobile devices and autonomous cars, there is an increasing demand for neural architecture search (NAS) to automatically design powerful network architectures. In such settings, it is more reasonable to formulate NAS as a multi-objective optimization problem: in addition to prediction performance, multi-objective NAS (MONAS) takes into account further criteria such as the number of parameters and inference latency. Multi-objective evolutionary algorithms (MOEAs) are the preferred approach for tackling MONAS due to their effectiveness in dealing with multi-objective optimization problems. Recently, local search-based NAS algorithms have demonstrated greater efficiency than MOEAs on MONAS problems; however, their performance has only been verified on bi-objective NAS problems. In this article, we propose LOMONAS, an efficient local search framework for multi-objective NAS that solves not only bi-objective NAS problems but also NAS problems with more than two objectives. We additionally present a parameter-less version of LOMONAS, namely IMS-LOMONAS, obtained by combining LOMONAS with the Interleaved Multi-start Scheme (IMS), which frees NAS practitioners from manually setting control parameters. Experimental results on a series of benchmark problems from the CEC'23 Competition demonstrate the competitiveness of LOMONAS and IMS-LOMONAS against MOEAs on MONAS problems in both small-scale and large-scale search spaces. Source code is available at: https://github.com/ELO-Lab/IMS-LOMONAS.
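To make the MONAS formulation above concrete, here is a minimal sketch of the underlying multi-objective setting: candidate architectures are compared by Pareto dominance over several objectives to be minimized. This is only an illustration of the problem formulation, not the LOMONAS algorithm itself; the objective triples (validation error, parameter count, latency) and their values are hypothetical.

```python
# Illustrative sketch of the multi-objective NAS setting (not the paper's algorithm).
# Each candidate is a hypothetical objective vector: (validation error, params in M,
# latency in ms), all of which are to be minimized.

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep only the mutually non-dominated objective vectors in `population`."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Hypothetical candidate architectures and their objective values.
candidates = [(0.08, 3.2, 12.0), (0.06, 5.1, 18.0),
              (0.09, 2.0, 9.0), (0.07, 5.5, 20.0)]
print(pareto_front(candidates))  # last candidate is dominated and dropped
```

A MONAS solver such as a local search or an MOEA returns an approximation of this Pareto front, offering trade-offs between accuracy and efficiency rather than a single best architecture.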