Abstract

Deep learning has achieved enormous breakthroughs in the field of image recognition. However, because discovering novel neural architectures is time-consuming and error-prone, designing a network tailored to a particular task remains a challenge. Hence, many automated neural architecture search methods have been proposed to find suitable deep neural network architectures for a specific task without human experts. Nevertheless, these methods are still computationally and economically expensive, since they require a vast amount of computing resources and/or computational time. In this paper, we propose several network morphism mutation operators with extra noise, and further redesign the macro-architecture based on a classical network. The proposed methods are embedded in an evolutionary algorithm and tested on the CIFAR-10 classification task. Experimental results demonstrate that our method can discover powerful neural architectures, achieving a classification error of 2.55% with only 4.7M parameters on CIFAR-10 within 12 GPU-hours.
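
To make the idea of a noisy network-morphism mutation concrete, the sketch below shows one possible "widen" operator for a pair of stacked fully connected layers: an existing layer is widened by duplicating units in a roughly function-preserving way, and small Gaussian noise is added to break the symmetry between duplicated units. This is a minimal illustration in PyTorch under assumed names (widen_linear, noise_std); it is not the exact set of operators used in the paper.

    import torch
    import torch.nn as nn

    def widen_linear(layer_in: nn.Linear, layer_out: nn.Linear,
                     new_width: int, noise_std: float = 1e-2):
        """Widen layer_in (and adjust layer_out) from its current width to
        new_width, approximately preserving the composed function, then add
        small Gaussian noise. Illustrative sketch, not the authors' operator."""
        old_width = layer_in.out_features
        assert new_width > old_width
        # Pick which existing hidden units to replicate.
        idx = torch.randint(0, old_width, (new_width - old_width,))
        mapping = torch.cat([torch.arange(old_width), idx])

        # New first layer: copy (and duplicate) rows of the weight matrix,
        # then perturb with noise to break symmetry between copies.
        new_in = nn.Linear(layer_in.in_features, new_width)
        new_in.weight.data = (layer_in.weight.data[mapping]
                              + noise_std * torch.randn(new_width, layer_in.in_features))
        new_in.bias.data = layer_in.bias.data[mapping]

        # New second layer: copy the corresponding columns and rescale
        # duplicated ones so their contributions still sum to the original.
        counts = torch.bincount(mapping, minlength=old_width).float()
        new_out = nn.Linear(new_width, layer_out.out_features)
        new_out.weight.data = layer_out.weight.data[:, mapping] / counts[mapping]
        new_out.bias.data = layer_out.bias.data.clone()
        return new_in, new_out

In an evolutionary loop, an operator of this kind would be applied to a parent network to produce a child that inherits the parent's weights, so each mutated candidate only needs brief fine-tuning rather than training from scratch; this is what keeps the overall search within a small GPU-hour budget.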
