Abstract
The selection of model architecture and hyperparameters has a significant impact on the diagnostic performance of most deep learning models. Because training and evaluating candidate deep learning architectures is time-consuming, manual architecture selection quickly becomes infeasible. We therefore propose a novel framework for evolutionary deep neural networks (DNNs) that uses a policy gradient to guide the evolution of the DNN architecture towards maximum diagnostic accuracy. A policy-gradient-based controller generates an action at every generation to sample a new model architecture, so that the search converges quickly toward optimality. The fitness of the best model obtained is used as a reward to update the policy parameters. In addition, the best model is transferred to the next generation for quick model evaluation within the NSGA-II evolutionary framework. Thus, the algorithm benefits from both fast non-dominated sorting and quick model evaluation. The effectiveness of the proposed framework has been validated on three datasets: the Air Compressor dataset, the Case Western Reserve University dataset, and the Paderborn University dataset.
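To make the controller's role concrete, below is a minimal REINFORCE-style sketch of how a policy could sample an architecture (the "action") each generation and be updated using the best model's fitness as the reward. The search space, the fitness placeholder, and all names here are illustrative assumptions, not the authors' implementation; in the paper the reward would come from the best model evolved by NSGA-II.

```python
# Minimal sketch of a policy-gradient controller for architecture sampling.
# Assumptions: a toy discrete search space (layers, units) and a stand-in
# fitness function; the real fitness is the diagnostic accuracy of the best
# model in the NSGA-II population.
import numpy as np

rng = np.random.default_rng(0)

LAYER_CHOICES = [2, 3, 4, 5]        # hypothetical number-of-layers choices
UNIT_CHOICES = [16, 32, 64, 128]    # hypothetical units-per-layer choices

# Policy parameters: independent softmax logits over each choice set.
theta_layers = np.zeros(len(LAYER_CHOICES))
theta_units = np.zeros(len(UNIT_CHOICES))
LR = 0.1  # policy learning rate

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def sample_action():
    """Sample an architecture (the 'action') from the current policy."""
    p_l, p_u = softmax(theta_layers), softmax(theta_units)
    i = rng.choice(len(LAYER_CHOICES), p=p_l)
    j = rng.choice(len(UNIT_CHOICES), p=p_u)
    return i, j

def fitness(arch):
    """Placeholder for training/evaluating the sampled DNN.

    Returns a stand-in accuracy; in the paper this would be the fitness of
    the best model found by the NSGA-II evolutionary search."""
    return rng.uniform(0.7, 0.95)

baseline = 0.0
for generation in range(20):
    i, j = sample_action()
    arch = (LAYER_CHOICES[i], UNIT_CHOICES[j])

    # The sampled architecture would seed the NSGA-II population; the best
    # evolved model's fitness serves as the reward for the controller.
    reward = fitness(arch)

    # REINFORCE update: grad log pi(action) * (reward - baseline).
    advantage = reward - baseline
    grad_l = -softmax(theta_layers); grad_l[i] += 1.0
    grad_u = -softmax(theta_units);  grad_u[j] += 1.0
    theta_layers += LR * advantage * grad_l
    theta_units  += LR * advantage * grad_u
    baseline = 0.9 * baseline + 0.1 * reward  # moving-average reward baseline
```

The moving-average baseline is a common variance-reduction choice in REINFORCE-style controllers; whether the authors use one, and how the architecture encoding is actually factored, is not stated in the abstract.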