Abstract

Boosting is one of the most important recent developments in classification methodology. It can significantly improve the prediction performance of any single classification algorithm and has been successfully applied in many fields, including problems in chemometrics. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. In this paper, we propose a generalized boosting algorithm based on the Bayes optimal decision rule. Using this rule, we adjust the weights of the sequence of classifiers in the voting step of the boosting algorithm. The two types of classification error are incorporated into the generalized boosting, making the voting process more sensible; the weights of the training samples are also adjusted according to a corresponding criterion. The generalized boosting algorithm is applied to binary classification of chemical data. Experimental results show that it improves prediction accuracy over the AdaBoost algorithm, especially when the difference between the two types of classification error is large.
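To make the baseline concrete, the mechanism the abstract describes, sequential reweighting of training samples followed by a weighted majority vote, can be sketched as standard binary AdaBoost with decision stumps. This is a minimal illustration of the reference algorithm only, not the paper's generalized method; all function names (`adaboost`, `best_stump`, `stump_predict`, `predict`) are ours for illustration.

```python
import numpy as np

def stump_predict(stump, X):
    # A stump votes +sign on one side of a threshold, -sign on the other.
    feat, thr, sign = stump
    return np.where(X[:, feat] <= thr, sign, -sign)

def best_stump(X, y, w):
    # Weak learner: exhaustively pick the stump minimizing weighted error.
    best, best_err = None, np.inf
    for feat in range(X.shape[1]):
        for thr in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                pred = np.where(X[:, feat] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (feat, thr, sign)
    return best

def adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = best_stump(X, y, w)            # fit to reweighted data
        pred = stump_predict(stump, X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # this classifier's vote weight
        w *= np.exp(-alpha * y * pred)         # upweight misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    # Weighted majority vote of the sequence of classifiers.
    agg = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(agg)
```

The paper's modification, as the abstract states, replaces the symmetric vote weight `alpha` (and the sample reweighting) with quantities derived from the Bayes optimal decision rule that distinguish the two types of error.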

