Abstract

Kernel partial least squares (KPLS) is an effective nonlinear modeling technique for control engineering applications, including model predictive control, process monitoring, and general system diagnosis. It can deal with small sample sizes and with variable sets that are noisy and highly correlated. Kernel partial least squares maps the input (or cause) variables to a feature space and produces an optimal prediction model for the process output (or effect) variables using the standard linear partial least squares (PLS) approach. Owing to the typically large size of the feature space, the kernel partial least squares procedure can be computationally intensive. In particular, if the optimal model structure is estimated using cross-validation, KPLS is inefficient in handling large data sets. This paper first modifies the conventional kernel partial least squares procedure in order to embed it within a leave-one-out cross-validation (LOOCV) framework. The proposed efficient kernel partial least squares (EKPLS) algorithm reduces the computational complexity by an order of magnitude compared to the conventional approach, which is proven both analytically and through modeling applications to three industrial data sets.
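To make the conventional procedure concrete, the following is a minimal sketch of kernel PLS as the abstract describes it (kernel mapping followed by a linear PLS fit), not of the paper's EKPLS modification. It assumes an RBF kernel and the NIPALS-style iteration of Rosipal and Trejo; all function names and parameter choices are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def center_kernel(K):
    """Center the kernel matrix in feature space: (I - J/n) K (I - J/n)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kpls_fit(K, Y, n_components, tol=1e-10, max_iter=500):
    """NIPALS-style kernel PLS on a centered kernel K and centered response Y.

    Returns the input-space score matrix T and output-space score matrix U."""
    n = K.shape[0]
    Kd, Yd = K.copy(), Y.copy()
    T = np.zeros((n, n_components))
    U = np.zeros((n, n_components))
    for a in range(n_components):
        u = Yd[:, [0]].copy()
        for _ in range(max_iter):
            t = Kd @ u
            t /= np.linalg.norm(t)
            c = Yd.T @ t
            u_new = Yd @ c
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        T[:, [a]], U[:, [a]] = t, u
        # Deflate the kernel and response matrices before the next component.
        P = np.eye(n) - t @ t.T
        Kd = P @ Kd @ P
        Yd = Yd - t @ (t.T @ Yd)
    return T, U

# Toy usage: learn the nonlinear map y = sin(x) from 40 samples.
X = np.linspace(-3.0, 3.0, 40).reshape(-1, 1)
Y = np.sin(X)
Kc = center_kernel(rbf_kernel(X, X))
Yc = Y - Y.mean(axis=0)
T, U = kpls_fit(Kc, Yc, n_components=5)
# Training-set predictions: Yhat = K U (T' K U)^-1 T' Y, then un-center.
Yhat = Kc @ U @ np.linalg.solve(T.T @ Kc @ U, T.T @ Yc) + Y.mean(axis=0)
r2 = 1.0 - ((Y - Yhat) ** 2).sum() / ((Y - Y.mean()) ** 2).sum()
```

Selecting `n_components` by cross-validation requires refitting this procedure once per held-out sample, which illustrates the cost the paper's LOOCV-embedded EKPLS is designed to avoid.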
