Abstract
Projection to latent structures, or partial least squares (PLS), is one of the most powerful linear regression techniques for dealing with noisy and highly correlated data. To address the inherent nonlinearity present in data from many industrial applications, a number of approaches have been proposed to extend it to the nonlinear case, most of which rely on the nonlinear approximation capability of neural networks. However, these methods either merely introduce nonlinearities into the inner relationship model within the linear PLS framework or suffer from training a complicated network. In this paper, starting from an equivalent formulation of PLS, both nonlinear latent structures and nonlinear reconstruction are obtained directly through two consecutive steps. First, a radial basis function (RBF) network is used to extract the latent structures by linear algebra methods, without the need for nonlinear optimization. This is followed by developing two feed-forward networks to reconstruct t...
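The first of the two steps described above lends itself to a compact illustration. The sketch below is not the paper's exact algorithm; it assumes Gaussian RBF units with centres drawn at random from the training inputs, a median-distance width heuristic, and SVD-based weight extraction with deflation (a standard PLS-style procedure). It only shows how latent structures can be pulled out of RBF features with plain linear algebra, with no nonlinear optimization; the second (reconstruction) step is not sketched here.

```python
# Minimal sketch (illustrative assumptions, not the authors' exact method):
# map inputs through fixed Gaussian RBF basis functions, then extract latent
# scores/weights between the RBF features and the outputs using only linear
# algebra (SVD of the cross-covariance, followed by deflation).
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian RBF activations of X with respect to the given centres."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def rbf_pls_latent(X, Y, n_centers=20, n_components=2, seed=0):
    """Extract latent scores T and weights W from RBF features of X."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float).reshape(len(X), -1)

    # Centre selection and width are illustrative heuristics (assumptions).
    centers = X[rng.choice(len(X), size=min(n_centers, len(X)), replace=False)]
    width = np.median(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2))

    Phi = rbf_features(X, centers, width)
    Phi = Phi - Phi.mean(axis=0)          # centre the nonlinear features
    Yc = Y - Y.mean(axis=0)

    T, W = [], []
    for _ in range(n_components):
        # Dominant left singular vector of the cross-covariance gives the weights.
        U, _, _ = np.linalg.svd(Phi.T @ Yc, full_matrices=False)
        w = U[:, 0]
        t = Phi @ w                        # latent score vector
        p = Phi.T @ t / (t @ t)            # loading used for deflation
        Phi = Phi - np.outer(t, p)         # deflate feature matrix
        Yc = Yc - np.outer(t, Yc.T @ t / (t @ t))
        T.append(t)
        W.append(w)
    return np.column_stack(T), np.column_stack(W), centers, width

# Toy usage on a nonlinear single-output relationship.
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(200)
T, W, centers, width = rbf_pls_latent(X, y, n_centers=25, n_components=2)
print(T.shape)  # (200, 2) latent score matrix
```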