Abstract

The great success of deep learning poses urgent challenges for understanding its working mechanism and rationale. Depth, structure, and massive data are recognized as three key ingredients of deep learning. Most recent theoretical studies of deep learning focus on the necessity and advantages of the depth and structure of neural networks. In this article, we aim at a rigorous verification of the importance of massive data in enabling the superior performance of deep learning. In particular, we prove that massive data are necessary for realizing spatial sparseness, and that deep nets are crucial tools for making full use of massive data in this application. These findings explain why deep learning has achieved great success in the era of big data, even though deep nets and numerous network structures were proposed at least 20 years ago.
