Abstract

The effects of ensemble learning methods, bagging and boosting, on kernel partial least squares (KPLS) regression are investigated. In bagged KPLS and boosting KPLS, a series of weak hypotheses is constructed with the KPLS algorithm, each trained on samples drawn from the training set. A regression model is then developed as an ensemble of these hypotheses. The abilities of bagged KPLS and boosting KPLS are evaluated on two near‐infrared (NIR) spectroscopic data sets, NIR diffuse reflectance spectra of dried tobacco leaves and NIR transmission spectra of Thai fish sauces, by comparison with standard partial least squares (PLS), bagged PLS, boosting PLS, and KPLS. The results reveal that bagged KPLS and boosting KPLS yield regression performance superior to standard PLS. In particular, boosting KPLS shows clear improvements over PLS, bagged PLS, boosting PLS, KPLS, and bagged KPLS. Copyright © 2007 John Wiley & Sons, Ltd.
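As a rough illustration of the bagged KPLS idea described above, the sketch below trains several KPLS models on bootstrap resamples and averages their predictions. It assumes an RBF kernel and approximates KPLS by applying scikit-learn's linear PLS to the kernel matrix (a "direct kernel PLS" style shortcut); the function names, gamma, n_components, and n_estimators values are illustrative and are not taken from the paper.

```python
# Minimal sketch of bagged kernel PLS regression, assuming an RBF kernel and
# linear PLS applied to the kernel matrix as a simple stand-in for KPLS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel


def fit_bagged_kpls(X, y, n_estimators=25, n_components=5, gamma=1e-3, seed=0):
    """Fit an ensemble of KPLS-style models on bootstrap resamples of (X, y)."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))     # bootstrap sample indices
        X_b, y_b = X[idx], y[idx]
        K_b = rbf_kernel(X_b, X_b, gamma=gamma)        # kernel matrix of the resample
        pls = PLSRegression(n_components=n_components)
        pls.fit(K_b, y_b)                              # linear PLS on kernel features
        models.append((X_b, pls))                      # keep the resample for prediction kernels
    return models


def predict_bagged_kpls(models, X_new, gamma=1e-3):
    """Average the predictions of all ensemble members."""
    preds = []
    for X_b, pls in models:
        K_new = rbf_kernel(X_new, X_b, gamma=gamma)    # kernel against that member's sample
        preds.append(pls.predict(K_new).ravel())
    return np.mean(preds, axis=0)
```

Boosting KPLS would follow the same pattern but reweight or reselect training samples according to the residuals of earlier members instead of drawing uniform bootstrap samples.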
