Abstract

In this article, we introduce the idea of Markov resampling for Boosting methods. We first prove that the Boosting algorithm with a general convex loss function, based on uniformly ergodic Markov chain (u.e.M.c.) samples, is consistent, and we establish its fast convergence rate. We then apply Markov-resampling-based Boosting to the Support Vector Machine (SVM) and introduce two new resampling-based Boosting algorithms: SVM Boosting based on Markov resampling (SVM-BM) and improved SVM Boosting based on Markov resampling (ISVM-BM). In contrast to SVM-BM, ISVM-BM uses the support vectors to compute the weights of the base classifiers. Numerical studies on benchmark datasets show that, with linear base classifiers, the two proposed resampling-based SVM Boosting algorithms achieve smaller misclassification rates and lower total sampling-and-training time than three classical AdaBoost algorithms: Gentle AdaBoost, Real AdaBoost, and Modest AdaBoost. In addition, we compare the proposed SVM-BM algorithm with the widely used and efficient gradient Boosting algorithm XGBoost (eXtreme Gradient Boosting) and with SVM-AdaBoost, and we present some useful discussion of the technical parameters.
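To make the overall scheme concrete, the sketch below illustrates one way Markov resampling can drive an SVM Boosting loop: a Markov chain walks over the training set, tending to retain points that the current base classifier handles poorly, and each round trains a new linear SVM on the resampled set. The acceptance rule (`exp(-margin)` for correctly classified points) and the AdaBoost-style weight formula are illustrative assumptions, not the paper's exact SVM-BM or ISVM-BM procedure; labels are assumed to be in {-1, +1}.

```python
import numpy as np
from sklearn.svm import LinearSVC

def markov_resample(X, y, clf, n_samples, rng):
    """Draw a subsample by a Markov chain walk over the data.
    A candidate point is always accepted if the current classifier
    misclassifies it, and accepted with probability exp(-margin)
    otherwise.  (Illustrative acceptance rule, an assumption here,
    not the paper's exact criterion.)"""
    idx = [int(rng.integers(len(X)))]
    while len(idx) < n_samples:
        cand = int(rng.integers(len(X)))
        margin = y[cand] * clf.decision_function(X[cand:cand + 1])[0]
        accept = 1.0 if margin <= 0 else np.exp(-margin)
        if rng.random() < accept:
            idx.append(cand)
    return X[idx], y[idx]

def svm_boost_markov(X, y, n_rounds=10, n_samples=200, seed=0):
    """Boosting with Markov-resampled training sets (rough sketch
    in the spirit of SVM-BM; the weighting is AdaBoost-style here,
    whereas ISVM-BM derives weights from the support vectors)."""
    rng = np.random.default_rng(seed)
    # Initial base learner trained on a uniform random subsample.
    idx0 = rng.integers(len(X), size=n_samples)
    clf = LinearSVC().fit(X[idx0], y[idx0])
    learners, alphas = [], []
    for _ in range(n_rounds):
        Xs, ys = markov_resample(X, y, clf, n_samples, rng)
        clf = LinearSVC().fit(Xs, ys)
        err = np.clip(np.mean(clf.predict(X) != y), 1e-6, 1 - 1e-6)
        alphas.append(0.5 * np.log((1 - err) / err))  # assumed weight rule
        learners.append(clf)

    def predict(Xq):
        score = sum(a * c.decision_function(Xq)
                    for a, c in zip(alphas, learners))
        return np.sign(score)
    return predict
```

Because each round retrains only on a fixed-size Markov-resampled subset rather than on reweighted copies of the full data, this style of procedure is consistent with the abstract's claim of reduced total sampling-and-training time relative to classical AdaBoost variants.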
