Abstract

Over the last decade, deep learning architectures have achieved increasingly good accuracy as they have grown deeper and wider, alongside other theoretical improvements. Despite their current success, however, they initially faced an overfitting problem that limited their usage. The first practical and widely adopted solution to overfitting in deep neural networks was a simple approach known as dropout. Dropout is a regularization technique that randomly drops units (along with their connections) from earlier layers during the training of neural networks. It is widely used, especially in image classification, speech recognition, and natural language processing tasks, where the features created by earlier layers are largely redundant. The use of dropout layers in other tasks is largely unexplored. In this study, we seek to answer the question of whether a dropout layer is also useful for classical regression problems. A three-layer deep neural network with a single dropout layer, evaluated at various dropout rates, was tested on 8 real regression datasets. According to the experiments, the dropout layer does not help with overfitting.
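As a rough illustration of the kind of setup the abstract describes, the sketch below builds a small fully connected regressor with a single dropout layer and trains it at several dropout rates. The framework (PyTorch), layer sizes, learning rate, and dropout rates are illustrative assumptions, not the exact configuration used in the study.

```python
import torch
import torch.nn as nn

class RegressionNet(nn.Module):
    """Fully connected regressor with one dropout layer.

    Hidden sizes and dropout placement are illustrative assumptions;
    the paper only states a three-layer net with a single dropout layer.
    """
    def __init__(self, n_features: int, dropout_rate: float = 0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Dropout(p=dropout_rate),  # randomly zeroes units during training only
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, 1),            # single continuous output for regression
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    # Synthetic data stands in for the real regression datasets used in the study.
    torch.manual_seed(0)
    X = torch.randn(256, 10)
    y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)

    for rate in (0.0, 0.2, 0.5):
        model = RegressionNet(n_features=10, dropout_rate=rate)
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        model.train()
        for _ in range(200):
            optimizer.zero_grad()
            loss = loss_fn(model(X), y)
            loss.backward()
            optimizer.step()
        model.eval()  # dropout is disabled at evaluation time
        with torch.no_grad():
            mse = loss_fn(model(X), y).item()
        print(f"dropout={rate:.1f}  training MSE={mse:.4f}")
```

In practice, one would compare held-out validation error across dropout rates (including 0.0 as a no-dropout baseline) to judge whether dropout actually reduces overfitting on a given regression dataset.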
