Abstract
The AdaBoost algorithm is a typical and successful representative of the Boosting family. It combines weak classifiers, which perform only slightly better than random guessing, into a strong classifier with high classification accuracy. Its hyperparameter n_estimators specifies the number of boosting iterations, i.e., the number of base classifiers: a value that is too large easily causes the model to overfit, while a value that is too small leads to underfitting, so the parameter should be set according to the characteristics of the current data set rather than arbitrarily. To address the uncertainty in the number of iterations of the AdaBoost algorithm, this paper introduces a Bayesian optimization algorithm for hyperparameter tuning, so that the hyperparameter value in AdaBoost is suited to the current data set, yielding a hyperparameter-optimized AdaBoost algorithm. Experimental results show that adopting Bayesian optimization for hyperparameter tuning and applying the optimized hyperparameter value to the AdaBoost algorithm not only improves its classification accuracy but also avoids overfitting and underfitting of the model.
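The procedure described above can be sketched as a Bayesian optimization loop over n_estimators, using cross-validated accuracy as the objective and expected improvement as the acquisition function. This is a minimal illustrative sketch, not the paper's exact experimental setup: the data set (scikit-learn's breast cancer data), the search range [10, 200], the Matérn-kernel Gaussian process surrogate, and the iteration budget are all assumptions made for the example.

```python
import numpy as np
from scipy.stats import norm
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score

# Illustrative data set; the paper's own benchmark data sets may differ.
X, y = load_breast_cancer(return_X_y=True)


def objective(n_estimators):
    """Mean 3-fold CV accuracy of AdaBoost with the given number of base classifiers."""
    clf = AdaBoostClassifier(n_estimators=int(n_estimators), random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()


rng = np.random.default_rng(0)
space = np.arange(10, 201)  # candidate n_estimators values (assumed search range)

# Initial design: evaluate a few randomly chosen points.
sampled = [int(n) for n in rng.choice(space, size=3, replace=False)]
scores = [objective(n) for n in sampled]

# Gaussian process surrogate model of the accuracy surface.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True, random_state=0)

for _ in range(5):  # a small Bayesian-optimization budget for illustration
    gp.fit(np.array(sampled).reshape(-1, 1), scores)
    mu, sigma = gp.predict(space.reshape(-1, 1), return_std=True)
    best = max(scores)
    sigma = np.maximum(sigma, 1e-9)
    # Expected improvement acquisition (we are maximizing accuracy).
    z = (mu - best) / sigma
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    ei[np.isin(space, sampled)] = -1.0  # do not re-evaluate known points
    nxt = int(space[np.argmax(ei)])
    sampled.append(nxt)
    scores.append(objective(nxt))

best_n = sampled[int(np.argmax(scores))]
print("best n_estimators:", best_n, "CV accuracy:", max(scores))
```

In practice the same loop is available off the shelf, e.g. via scikit-optimize's BayesSearchCV wrapped around AdaBoostClassifier; the hand-written loop above only makes the surrogate-model and acquisition steps explicit.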