Abstract

Deep neural network techniques have recently been recognized as powerful tools for solving complex and challenging modeling problems for microwave components. However, directly training a fully connected deep neural network with sigmoid activation functions using the backpropagation (BP) algorithm is difficult because of the vanishing gradient problem. In this paper, we propose a novel deep neural network modeling technique with batch normalization (BN) to address the vanishing gradient problem. A BN layer is added before every sigmoid hidden layer of the deep neural network to normalize that layer's inputs, with additional scaling and shifting, thereby overcoming the vanishing gradient problem. An automated model generation (AMG) algorithm is also used to automatically determine a suitable number of BN layers and sigmoid hidden layers during the deep neural network training process. The proposed technique is illustrated with two microwave examples.
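The mechanism described above can be illustrated with a minimal NumPy sketch (not the authors' code): sigmoid derivatives vanish when pre-activations are large in magnitude, and normalizing each feature over the batch, with learnable scale `gamma` and shift `beta` as in standard batch normalization, keeps inputs in the sigmoid's non-saturated region. The network sizes and input statistics here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def batch_norm(z, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift.
    mu = z.mean(axis=0)
    var = z.var(axis=0)
    return gamma * (z - mu) / np.sqrt(var + eps) + beta

rng = np.random.default_rng(0)
# Wide pre-activations (std = 8) push the sigmoid into saturation.
z = rng.normal(0.0, 8.0, size=(64, 16))

# sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)); its mean magnitude
# indicates how much gradient survives a backward pass through the layer.
s_raw = sigmoid(z)
grad_raw = s_raw * (1.0 - s_raw)          # without BN: mostly near zero

s_bn = sigmoid(batch_norm(z))
grad_bn = s_bn * (1.0 - s_bn)             # with BN: well away from zero

print("mean |sigmoid'| without BN:", grad_raw.mean())
print("mean |sigmoid'| with BN:   ", grad_bn.mean())
```

Stacking such a BN layer before each sigmoid hidden layer repeats this effect depth-wise, which is why the composed gradient product no longer shrinks toward zero.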
