Abstract

This work explores a hypothesis for the observation that the accuracy of Deep Neural Networks (DNNs) increases with the depth of the network. The aim of the project is to count the number of exact solutions to a simplified DNN problem. A finite family of DNN functions is defined so that the number of solutions can be counted as a function of depth. By constructing these DNN solutions, a lower bound and an approximate rate of growth are obtained for the number of solutions. This function indicates that the number of solutions grows rapidly with depth, which may offer some insight into why the accuracy of DNNs increases with network depth.
