Abstract

With the rapid development of Artificial Neural Network (ANN) models, the number of hyperparameters grows constantly. With so many parameters, automatic tools are needed to build new models or adapt existing ones to new problems. This has led to the widespread use of Neural Architecture Search (NAS) methods, which perform hyperparameter optimisation, so-called hyperparameter tuning, over a vast space of model hyperparameters. Because modern NAS techniques are applied to optimise models across many domains and to combine models from previous experiments, their hyperparameter optimisation routines require substantial computational power. Despite the highly parallel nature of many NAS methods, they still need considerable computational time to converge and to reuse information from generations of previously synthesised models. This creates a demand for parallel implementations that run on different cluster configurations, utilise as many nodes as possible, and scale well. However, simple approaches that solve the NAS problem without considering results from previous launches lead to inefficient cluster utilisation. In this article, we introduce a new approach to optimising NAS processes: limiting the search space, and thereby the number of search parameters and the dimensionality of the space, using information from previous NAS launches, which reduces the demand for computational power and improves cluster utilisation.
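
The following minimal sketch illustrates the general idea of narrowing a hyperparameter search space from the results of earlier NAS launches; it is not the authors' implementation, and the parameter names, bounds, and top-k heuristic are assumptions made for the example.

```python
# Illustrative sketch: restrict a NAS search space using results of previous
# launches. Ranges that collapse to a single value among the best previous
# configurations are fixed, removing a dimension from the search.
from typing import Dict, List, Tuple

# Hypothetical original search space: name -> (lower bound, upper bound).
SEARCH_SPACE: Dict[str, Tuple[float, float]] = {
    "learning_rate": (1e-5, 1e-1),
    "num_layers": (2, 32),
    "hidden_units": (16, 1024),
    "dropout": (0.0, 0.7),
}

# Hypothetical results of previous NAS launches: configurations and scores.
previous_runs: List[Dict] = [
    {"params": {"learning_rate": 3e-3, "num_layers": 8,  "hidden_units": 256, "dropout": 0.2}, "score": 0.91},
    {"params": {"learning_rate": 1e-3, "num_layers": 10, "hidden_units": 512, "dropout": 0.2}, "score": 0.93},
    {"params": {"learning_rate": 5e-2, "num_layers": 4,  "hidden_units": 64,  "dropout": 0.5}, "score": 0.78},
    {"params": {"learning_rate": 2e-3, "num_layers": 12, "hidden_units": 384, "dropout": 0.2}, "score": 0.92},
]

def restrict_space(space, runs, top_k=3):
    """Shrink each hyperparameter range to the span covered by the top-k
    previous configurations; constant parameters are fixed (dimension removed)."""
    best = sorted(runs, key=lambda r: r["score"], reverse=True)[:top_k]
    restricted, fixed = {}, {}
    for name, (lo, hi) in space.items():
        values = [r["params"][name] for r in best]
        new_lo, new_hi = max(lo, min(values)), min(hi, max(values))
        if new_lo == new_hi:
            fixed[name] = new_lo          # removed from the search space
        else:
            restricted[name] = (new_lo, new_hi)
    return restricted, fixed

if __name__ == "__main__":
    space, fixed = restrict_space(SEARCH_SPACE, previous_runs)
    print("restricted search space:", space)
    print("fixed (removed) dimensions:", fixed)
```

In this sketch the later launch searches only the narrowed ranges and skips the fixed dimensions, which is one simple way the reduced search space can translate into lower computational demands on a cluster.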
