Abstract

Neural Architecture Search Without Training (NASWOT) has recently been proposed as an alternative to conventional Neural Architecture Search (NAS). Pioneering works deploy only one or two indicators to guide the search, yet the quantitative assessment of such indicators has not been fully studied. In this paper, we first review several indicators used to evaluate a network in a training-free manner, including the correlation of Jacobian, the output sensitivity, the number of linear regions, and the condition number of the neural tangent kernel. We observe that each indicator characterizes a network from a specific aspect, and no single indicator performs well in all cases, e.g., is highly correlated with the test accuracy. This motivates us to develop a novel indicator that takes multiple properties of a network into account. To obtain a better indicator that considers the various characteristics of networks in a harmonized form, we propose a Fusion Indicator (FI). Specifically, the proposed FI combines multiple indicators as a weighted sum, and the weights are obtained by minimizing the mean squared error between the predicted and actual accuracies of networks. Moreover, since prior training-free NAS studies used a limited set of metrics to evaluate indicator quality, we introduce more informative metrics that assess a training-free NAS indicator in terms of fidelity, correlation, and rank-order similarity between the predicted quality value and the actual accuracy of networks: the Pearson Linear Correlation Coefficient (PLCC), the Root Mean Square Error (RMSE), the Spearman Rank-Order Correlation Coefficient (SROCC), and the Kendall Rank-Order Correlation Coefficient (KROCC). Extensive experiments on NAS-Bench-101 and NAS-Bench-201 demonstrate the effectiveness of our FI, which outperforms existing methods by a large margin.
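To make the fusion step concrete, the following is a minimal sketch (not the authors' implementation) of how a weighted-sum fusion indicator can be fitted by minimizing the MSE between predicted and actual accuracies, and how the four quality metrics listed above can be computed. The arrays `indicator_scores` and `test_accuracy` are hypothetical placeholders standing in for real benchmark data.

```python
# Minimal sketch (assumed, not the paper's code): fit a weighted-sum fusion
# of training-free indicators by least squares (equivalent to minimizing the
# MSE between predicted and actual accuracy), then evaluate the fitted
# indicator with PLCC, RMSE, SROCC and KROCC.
import numpy as np
from scipy import stats

# Hypothetical data: rows = candidate architectures, columns = indicator
# values (e.g., CJ, OS, NLR, CNNTK), plus measured test accuracies.
indicator_scores = np.random.rand(1000, 4)   # placeholder indicator matrix
test_accuracy = np.random.rand(1000)         # placeholder accuracies

# Append a bias column and solve min_w ||X w - y||^2, i.e., the MSE objective.
X = np.hstack([indicator_scores, np.ones((indicator_scores.shape[0], 1))])
weights, *_ = np.linalg.lstsq(X, test_accuracy, rcond=None)

predicted = X @ weights                      # fused indicator (FI) output

# Quality metrics for a training-free NAS indicator:
plcc = stats.pearsonr(predicted, test_accuracy)[0]    # linear fidelity
rmse = np.sqrt(np.mean((predicted - test_accuracy) ** 2))
srocc = stats.spearmanr(predicted, test_accuracy)[0]  # rank-order similarity
krocc = stats.kendalltau(predicted, test_accuracy)[0] # pairwise rank agreement

print(f"PLCC={plcc:.3f}  RMSE={rmse:.3f}  SROCC={srocc:.3f}  KROCC={krocc:.3f}")
```

In practice the indicator scores would be computed on untrained architectures from a benchmark such as NAS-Bench-201, with the ground-truth accuracies taken from the benchmark's lookup tables.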

Highlights

  • Deep neural networks (DNNs) have shown remarkable performance on various computer vision tasks

  • We propose a Fusion Indicator (FI), which combines the outputs of multiple indicators with learned weights

  • In order to design a better indicator, we study several methods that can evaluate characteristics of a network before training, namely the correlation of Jacobian (CJ), the output sensitivity (OS), the number of linear regions (NLR), and the condition number of the neural tangent kernel (CNNTK); a sketch of one such indicator follows this list
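As an illustration of one such training-free indicator, the sketch below computes a Jacobian-correlation style score for an untrained network on a single mini-batch, in the spirit of the scoring rule popularized in the NASWOT line of work. The function and variable names are hypothetical, and this is not the authors' implementation.

```python
# Minimal sketch (assumed): a Jacobian-correlation style score for an
# untrained network, evaluated on one mini-batch of inputs.
import torch

def jacobian_correlation_score(net, inputs, eps=1e-5):
    """Score an untrained `net` by how de-correlated its input-Jacobians
    are across the samples in `inputs` (higher = more expressive network)."""
    inputs = inputs.clone().requires_grad_(True)
    out = net(inputs)
    # Jacobian of the summed output w.r.t. each input sample.
    out.sum().backward()
    jac = inputs.grad.reshape(inputs.shape[0], -1)      # (batch, features)

    # Correlation matrix of the per-sample Jacobians.
    jac = jac - jac.mean(dim=1, keepdim=True)
    jac = jac / (jac.norm(dim=1, keepdim=True) + eps)
    corr = jac @ jac.t()                                 # (batch, batch)

    # Penalize strongly correlated Jacobians via the eigenvalue spectrum.
    eigvals = torch.linalg.eigvalsh(corr)
    return -torch.sum(torch.log(eigvals + eps) + 1.0 / (eigvals + eps)).item()

# Hypothetical usage with a randomly initialized candidate and a random batch:
# score = jacobian_correlation_score(candidate_net, torch.randn(32, 3, 32, 32))
```

The other indicators (OS, NLR, CNNTK) follow the same pattern: each is a scalar computed from a forward and/or backward pass of the untrained network on a small batch, so all can be evaluated without any training.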


Summary

Introduction

Deep neural networks (DNNs) have shown remarkable performance on various computer vision tasks. Since the success of AlexNet on the ImageNet [1] classification task in 2012 [2], many high-performance networks have been introduced [3][4][5], all of which were designed by experts. Manual design is not an optimal choice, especially as networks grow deeper, because the process requires immense time and effort. To reduce the cost of designing networks, researchers have sought to automate the process, leading to Neural Architecture Search (NAS). Instead of designing the architecture directly, experts design search algorithms that find good candidates (e.g., the number of layers, the number of filters, the types of activation, etc.) within a given search space.


