Abstract

Neural Architecture Search (NAS) has recently demonstrated a strong ability to design networks automatically for various tasks. Most current approaches guide the search with validation-performance-based architecture evaluation, which estimates an architecture's quality by training and validating it on a specific large dataset. However, on small-scale datasets, a model's performance on the validation set is an imprecise estimate of its performance on the test set, and this imprecise architecture evaluation can mislead the search toward sub-optimal architectures. To address this problem, we propose an efficient multi-objective evolutionary zero-shot NAS framework that evaluates architectures with zero-cost metrics, which can be computed on randomly initialized models without any training. Specifically, we propose a general design principle for zero-cost metrics that unifies existing metrics and guides the development of several new ones. We then present an efficient method that computes multiple zero-cost metrics in a single forward and backward pass. Finally, comprehensive experiments on NAS-Bench-201 and MedMNIST show that the proposed method achieves sufficiently accurate, high-throughput performance on MedMNIST while being 20× faster than the previous best method.
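To illustrate the shared-pass idea, the sketch below (not the authors' implementation) computes two commonly used zero-cost metrics, a gradient-norm score and a SNIP-style saliency, from the gradients of one forward and one backward pass on a randomly initialized model. The function name `zero_cost_metrics` and the metric formulations are assumptions for illustration only.

```python
# Minimal sketch, assuming PyTorch and common zero-cost metric formulations
# (grad-norm and SNIP-style saliency). Not the paper's actual method.
import torch
import torch.nn as nn


def zero_cost_metrics(model: nn.Module, inputs, targets):
    """Compute several zero-cost metrics from a single forward/backward pass."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)  # one forward pass
    loss.backward()                                             # one backward pass

    grad_norm_sq, snip = 0.0, 0.0
    for p in model.parameters():
        if p.grad is not None:
            grad_norm_sq += p.grad.pow(2).sum().item()          # sum of squared gradients
            snip += (p.grad * p.data).abs().sum().item()        # SNIP-style saliency |g * w|
    return {"grad_norm": grad_norm_sq ** 0.5, "snip": snip}


# Usage: a randomly initialized model and one random mini-batch suffice;
# no training is required, so scoring an architecture is very cheap.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
x, y = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
print(zero_cost_metrics(model, x, y))
```

Because both metrics are read off the same gradient tensors, adding further gradient-based metrics to the loop costs only the extra per-parameter arithmetic, not additional passes through the network.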
