Abstract
A novel formulation of the wide kernel algorithm for partial least squares regression (PLSR) is proposed. We show how eliminating redundant calculations in traditional applications of PLSR speeds up any choice of cross-validation strategy by utilizing precalculated lookup matrices. The proposed lookup approach is combined with additional computational shortcuts, resulting in highly efficient and numerically accurate cross-validation. The computational advantages of the proposed method are demonstrated by comparison with the classical NIPALS and bidiag2 algorithms for calculating cross-validated PLSR models. The considered applications include problems with one and several responses, double/nested cross-validation, and one-vs-all classification.
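The core of the lookup idea can be illustrated with a minimal sketch (this is an illustration of the general kernel-submatrix principle, not the authors' implementation): for wide data, the Gram matrix XXᵀ is precomputed once, and each cross-validation segment's training kernel is obtained by indexing into it rather than recomputing a matrix product per fold.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 500))  # wide data: few samples, many variables

# Precompute the full kernel (Gram) matrix once, before cross-validation.
K = X @ X.T

# Example leave-one-out segment: hold out sample i.
i = 3
train = np.delete(np.arange(X.shape[0]), i)

# Lookup: the training kernel is simply a submatrix of the precomputed K,
# avoiding one (n_train x p) @ (p x n_train) product per fold.
K_train_lookup = K[np.ix_(train, train)]

# Naive recomputation for comparison.
K_train_direct = X[train] @ X[train].T

assert np.allclose(K_train_lookup, K_train_direct)
```

Since indexing a precomputed matrix is far cheaper than a matrix product over many variables, the saving grows with the number of folds and the width of X.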
Highlights
The partial least squares regression (PLSR) method was introduced to the field of chemometrics in the early 1980s.
The present work introduces a novel kernel-based algorithm for efficient PLSR model selection under arbitrary cross-validation strategies, much faster than the traditional cross-validation approaches.
We compare parsimonious kernel PLS (PKPLS) with quick cross-validation, abbreviated PKPLS(QCV); PKPLS with ordinary cross-validation; NIPALS PLS; and PLS by the bidiag2 algorithm.
Summary
The partial least squares regression (PLSR) method was introduced to the field of chemometrics in the early 1980s.[1-4] The computational ideas of PLSR were discovered much earlier and are equivalent to a much older implementation of the conjugate gradient method of Hestenes and Stiefel[5] for numerically solving the normal equations of an ordinary least squares (OLS) problem; see Phatak and de Hoog.[6] Numerous algorithms for implementing PLSR have since been proposed. The present work introduces a novel kernel-based algorithm for obtaining efficient PLSR model selections based on arbitrary cross-validation strategies that are much faster than the traditional cross-validation approaches to PLSR model selection.