Abstract

Multi-Objective Evolutionary Neural Architecture Search (MOENAS) methods employ evolutionary algorithms to approximate a set of architectures representing optimal trade-offs between network performance and complexity. Directly estimating network performance via error rates or losses incurs long runtimes due to the computationally expensive network training procedure. Instead, low-cost metrics that require no network training have been proposed as proxies for network performance. However, these metrics may exhibit inconsistent correlations with network performance across different search spaces, and the influence of training-based and training-free metrics on the effectiveness and efficiency of MOENAS remains under-explored. We introduce Enhanced Training-Free MOENAS (E-TF-MOENAS), which employs the widely used NSGA-II as the search algorithm and optimizes multiple training-free performance metrics as separate objectives. Experiments on NAS-Bench-101 and NAS-Bench-201 show that E-TF-MOENAS outperforms training-free methods that rely on a single training-free performance metric and obtains results comparable to training-based methods at approximately 30 times less computation cost. On NAS-Bench-201, E-TF-MOENAS finds architectures with state-of-the-art mean accuracies of 94.37%, 73.50%, and 46.62% on CIFAR-10, CIFAR-100, and ImageNet16-120, respectively, in less than 3 GPU hours. These results indicate that utilizing multiple training-free proxy metrics simultaneously is beneficial, and E-TF-MOENAS provides a convenient framework for building such an efficient NAS approach. The source code can be found at https://github.com/ELO-Lab/E-TF-MOENAS.
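
To make the setup concrete, below is a minimal sketch (not the authors' implementation) of the core idea: running NSGA-II over an architecture encoding while treating several training-free proxy metrics, plus a complexity measure, as separate objectives. The proxy functions `proxy_score_a` and `proxy_score_b`, the `complexity` function, and the real-valued encoding are hypothetical placeholders; the actual work uses zero-cost proxies evaluated on NAS-Bench-101/201 architectures.

```python
# Minimal sketch (assumptions noted above) of multi-objective, training-free NAS
# with NSGA-II via pymoo: several training-free proxies and a complexity measure
# are optimized as separate objectives, yielding a Pareto set of architectures.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize


def proxy_score_a(encoding):
    # Placeholder training-free metric (higher assumed better).
    return float(np.sum(encoding))


def proxy_score_b(encoding):
    # A second, different placeholder training-free metric.
    return float(np.var(encoding))


def complexity(encoding):
    # Placeholder complexity measure (lower is better), e.g. #params or FLOPs.
    return float(np.sum(encoding ** 2))


class TrainingFreeNASProblem(ElementwiseProblem):
    """Architectures encoded as real-valued vectors in [0, 1]^n for simplicity."""

    def __init__(self, n_var=6):
        super().__init__(n_var=n_var, n_obj=3, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # NSGA-II minimizes all objectives, so "higher is better" proxies are negated.
        out["F"] = [-proxy_score_a(x), -proxy_score_b(x), complexity(x)]


# Run the search and keep the non-dominated (Pareto-optimal) architectures.
result = minimize(TrainingFreeNASProblem(), NSGA2(pop_size=20), ("n_gen", 30), seed=1)
print(result.X.shape, result.F.shape)  # Pareto-set encodings and objective values
```

After the search, the non-dominated encodings would be decoded into networks and a final architecture selected, for instance by validation accuracy; no network training is needed during the search itself.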
