Abstract

Extreme learning machine (ELM) is suitable for nonlinear soft sensor development, yet it is prone to overfitting. To overcome this, this work integrates bound optimization theory with variational Bayesian (VB) inference to derive novel L1 norm-based ELMs. An L1 term is attached to the squared sum cost of prediction errors to formulate an objective function. Considering the nonconvexity and nonsmoothness of this objective function, the article uses bound optimization theory and constructs a proper surrogate function to equivalently convert the challenging L1 norm-based optimization problem into an easy one. Then, VB inference is adopted to optimize the converted problem. Thus, an L1 norm-based ELM can be efficiently optimized by an alternating optimization algorithm with proven convergence. Finally, a soft sensor is developed based on the proposed algorithm. An industrial case study is carried out to demonstrate that the proposed soft sensor is competitive with recent ones.
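The abstract does not give the exact formulation, but a typical L1 norm-based ELM objective of this kind, sketched here with the hidden-layer output matrix \(\mathbf{H}\), output weights \(\boldsymbol{\beta}\), targets \(\mathbf{y}\), and regularization weight \(\lambda\) all assumed rather than taken from the paper, would be

\[
J(\boldsymbol{\beta}) = \|\mathbf{y} - \mathbf{H}\boldsymbol{\beta}\|_2^2 + \lambda \|\boldsymbol{\beta}\|_1 .
\]

In bound optimization (majorize-minimize) treatments of the L1 norm, a common surrogate replaces each nonsmooth term \(|\beta_j|\) with a quadratic upper bound that is tight at the current estimate \(\beta_j^{(t)}\),

\[
|\beta_j| \;\le\; \frac{\beta_j^2}{2\,|\beta_j^{(t)}|} + \frac{|\beta_j^{(t)}|}{2},
\]

so that minimizing the surrogate reduces to a reweighted ridge regression with a closed-form update, which is then amenable to VB inference. This is a standard construction offered only as an illustration; the paper's actual surrogate function and VB factorization may differ.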
