Abstract

Deep neural networks (DNNs) require intensive tuning of their configurations, such as network structures and learning conditions. This tuning is a black-box optimization problem to which evolutionary algorithms are applicable. A distinctive property of evolutionary optimization of DNN configurations is its double structure: the evolutionary algorithm optimizes a chromosome representing the DNN configuration, while an individual DNN with that configuration learns from training data, typically by back-propagation. Aiming to obtain better-optimized DNNs with evolutionary algorithms, we propose a dual inheritance evolution strategy based on an analogy to human brain evolution, in which genes and culture co-evolve. The proposed method extends a conventional evolution strategy with an additional pass that directly propagates culture, or knowledge, from ancestor DNNs to descendant DNNs by integrating teacher-student learning. We apply the proposed method to the automatic tuning of an end-to-end neural network-based speech recognition system. Experimental results show that the proposed method produces a smaller model with higher recognition performance than a baseline optimization based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
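The abstract describes two inheritance channels: configurations evolve genetically through an evolution strategy, while learned knowledge passes culturally from the best ancestor to its offspring via teacher-student (distillation) training. The sketch below illustrates that interplay on a toy task; it is not the paper's implementation. A simplified isotropic Gaussian ES stands in for CMA-ES, and the two-gene chromosome, its decoding ranges, and the mixing weight `alpha` are hypothetical choices made for the example.

```python
# Minimal dual-inheritance sketch, assuming: a two-gene chromosome
# (hidden width, learning rate), an isotropic Gaussian ES in place of
# CMA-ES, a synthetic 10-class task, and a hypothetical weight `alpha`
# mixing the hard-label loss with the teacher-student loss.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.randn(512, 20)              # synthetic training inputs
y = torch.randint(0, 10, (512,))      # synthetic training labels

def decode(z):
    """Map a real-valued chromosome to a DNN configuration."""
    hidden = int(8 + 56 / (1 + math.exp(-z[0])))   # 8..64 hidden units
    lr = 10 ** (-3 + 2 * math.tanh(z[1]))          # ~1e-5 .. 1e-1
    return hidden, lr

def build(hidden):
    return nn.Sequential(nn.Linear(20, hidden), nn.ReLU(),
                         nn.Linear(hidden, 10))

def train(model, lr, teacher=None, alpha=0.5, steps=100):
    """Individual learning; the teacher term is the 'cultural' pass."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logits = model(X)
        loss = F.cross_entropy(logits, y)          # learning from data
        if teacher is not None:                    # learning from an ancestor
            with torch.no_grad():
                soft = F.softmax(teacher(X), dim=-1)
            kd = F.kl_div(F.log_softmax(logits, dim=-1), soft,
                          reduction="batchmean")
            loss = alpha * loss + (1 - alpha) * kd
        loss.backward()
        opt.step()
    with torch.no_grad():   # training accuracy as a stand-in fitness
        return (model(X).argmax(-1) == y).float().mean().item()

mean, sigma, lam = torch.zeros(2), 0.5, 6          # ES state (isotropic)
teacher = None
for gen in range(5):
    offspring = [mean + sigma * torch.randn(2) for _ in range(lam)]
    scored = []
    for z in offspring:
        hidden, lr = decode(z.tolist())
        model = build(hidden)
        acc = train(model, lr, teacher=teacher)    # genes + culture
        scored.append((acc, z, model))
    scored.sort(key=lambda t: -t[0])
    elite = scored[: lam // 2]
    mean = torch.stack([z for _, z, _ in elite]).mean(0)   # genetic pass
    teacher = elite[0][2]                                  # cultural pass
    print(f"generation {gen}: best fitness {elite[0][0]:.3f}")
```

In the paper's setting the individuals are end-to-end speech recognizers and fitness would be held-out recognition performance rather than training accuracy; the point of the sketch is the loop structure, with a genetic pass through the ES distribution update and a cultural pass through the inherited teacher.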
