Abstract
Multi-objective neural architecture search (NAS) and resource-aware NAS have recently attracted considerable attention because they evaluate architectures holistically, accounting for inference latency, accuracy, and model size. However, NAS also exacerbates ever-increasing costs in engineering effort, time complexity, and computational resources. To alleviate this, reproducible NAS research releases benchmarks that record the metrics (e.g., accuracy, latency, and parameter count) of representative models from typical search spaces on specific tasks. Motivated by multi-objective NAS, resource-aware NAS, and reproducible NAS, this paper focuses on binary-relation prediction of latency and accuracy, which is a more practical and cost-effective way to serve general NAS scenarios. We conduct a reproducible NAS study on a MobileNet-based search space and release the resulting dataset. Furthermore, we are the first to model the features shared among prediction tasks (latency, accuracy, parameters, and FLOPs), which facilitates prediction on each individual task, and we formulate architecture ranking prediction within a multi-task learning framework. The proposed multi-task binary-relation prediction model achieves 94.3% accuracy on latency and 85.02% on Top-1 accuracy with only 100 training points, outperforming the single-task learning based model.
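The binary-relation formulation described above can be sketched as follows. This is a minimal illustrative example, not the paper's actual model: the class name, the tiny network shape, and the 8-dimensional architecture encoding are all assumptions. Two architecture encodings pass through a shared trunk (modeling the common features across tasks), and separate heads predict which architecture wins on latency and on accuracy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class PairwiseRanker:
    """Toy multi-task binary-relation predictor (illustrative sketch only).

    Given encodings of two architectures (a, b), a shared trunk extracts
    features common to all prediction tasks; separate linear heads then
    estimate P(a beats b) on latency and on accuracy, mirroring the
    multi-task learning formulation at a high level.
    """

    def __init__(self, in_dim, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        # Shared trunk weights (common features across tasks).
        self.W_shared = rng.normal(scale=0.1, size=(2 * in_dim, hidden))
        # Task-specific heads for the two binary relations.
        self.w_latency = rng.normal(scale=0.1, size=hidden)
        self.w_accuracy = rng.normal(scale=0.1, size=hidden)

    def predict(self, arch_a, arch_b):
        x = np.concatenate([arch_a, arch_b])
        h = np.tanh(x @ self.W_shared)  # shared representation
        return {
            "p_a_faster": sigmoid(h @ self.w_latency),
            "p_a_more_accurate": sigmoid(h @ self.w_accuracy),
        }

# Hypothetical 8-dim architecture encodings (e.g., depth/width choices).
ranker = PairwiseRanker(in_dim=8)
out = ranker.predict(np.ones(8), np.zeros(8))
```

In practice the trunk would be trained jointly on all tasks (latency, accuracy, parameters, FLOPs) so that each pairwise head benefits from the shared features, which is what allows good ranking performance from as few as 100 training points.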