Abstract

Boosting models such as XGBoost and GBDT are ensemble learning techniques that take decision trees as weak learners and achieve strong results on classification and regression problems. Neural networks perform excellently on image and speech recognition, but their weak interpretability limits the development of fusion models. Drawing on the principles and methods of traditional boosting models, we propose Neural Network Boosting (NNBoost) regression, which takes shallow neural networks with simple structures as weak learners. NNBoost is a new ensemble learning method that obtains low regression errors on several data sets. Its target loss function is approximated by a Taylor expansion, and by deriving the gradient form of NNBoost we give a gradient descent algorithm. Deep learning architectures are complex and suffer from problems such as vanishing gradients, weak interpretability, and parameters that are difficult to tune. We use an ensemble of simple neural networks to alleviate the vanishing gradient problem, which is laborious to solve in deep learning, and to counter overfitting. Finally, experiments verify the correctness and effectiveness of NNBoost from multiple angles, demonstrate the effect of fusing multiple shallow neural networks, and, to a certain extent, widen the development path of the boosting idea and deep learning.
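
To make the idea concrete, the following is a minimal Python sketch of gradient boosting with shallow neural networks as weak learners. It is not the paper's exact algorithm: the class name NNBoostSketch and all hyperparameter choices are illustrative assumptions, and instead of the second-order Taylor approximation described in the abstract it uses plain squared loss, for which the negative gradient reduces to the residual.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

class NNBoostSketch:
    """Gradient boosting with shallow neural networks as weak learners.

    A minimal illustration of the idea in the abstract, assuming squared
    loss; the paper's second-order Taylor update is not reproduced here.
    """

    def __init__(self, n_estimators=10, learning_rate=0.1, hidden_units=8):
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate
        self.hidden_units = hidden_units
        self.learners_ = []
        self.init_ = 0.0

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        # Initialize the ensemble with the mean prediction.
        self.init_ = y.mean()
        pred = np.full_like(y, self.init_)
        for _ in range(self.n_estimators):
            # For squared loss, the negative gradient is the residual.
            residual = y - pred
            # Each weak learner is a single-hidden-layer network.
            net = MLPRegressor(hidden_layer_sizes=(self.hidden_units,),
                               max_iter=500, random_state=0)
            net.fit(X, residual)
            # Add the shrunken prediction of the new learner.
            pred += self.learning_rate * net.predict(X)
            self.learners_.append(net)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        pred = np.full(X.shape[0], self.init_)
        for net in self.learners_:
            pred += self.learning_rate * net.predict(X)
        return pred
```

Each round fits a single-hidden-layer network to the current residuals and adds its shrunken prediction to the ensemble, mirroring how GBDT adds trees; because every weak learner is shallow, each stage trains a short gradient path rather than one deep network, which is the sense in which the ensemble sidesteps vanishing gradients.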
