Abstract

Just-in-time learning (JITL) is one of the most widely used strategies for soft sensor modeling in nonlinear processes. However, traditional JITL methods have difficulty dealing with data samples that contain missing values. Moreover, data noise and uncertainty have not been taken into consideration for relevant sample selection in existing JITL approaches. To overcome these problems, a new probabilistic JITL (P-JITL) framework is proposed in this brief. In P-JITL, variational Bayesian principal component analysis is first utilized to handle missing values and extract Gaussian posterior distributions of latent variables. Then, the symmetric Kullback-Leibler divergence is employed to measure the dissimilarity of two distributions for relevant sample selection in the JITL framework. Finally, a nonlinear regression model, Gaussian process regression, is used to model the nonlinear relationship between the output and the extracted latent variables. In this way, the proposed P-JITL is able to deal with missing data and select relevant samples more accurately. To evaluate the effectiveness and flexibility of P-JITL, comparative studies between P-JITL and a traditional deterministic JITL (D-JITL) are carried out on a numerical example and an industrial application example, in which missing data are simulated at percentages from 0% to 50%. The results show that P-JITL achieves higher prediction accuracy than D-JITL in each scenario considered.
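The following is a minimal sketch, not the authors' implementation, of the two core steps summarized above: ranking historical samples by the symmetric Kullback-Leibler divergence between Gaussian latent posteriors (assumed to have been produced by a VBPCA step, with diagonal covariances for simplicity) and fitting a local Gaussian process regression model on the selected samples. All function names, the number of neighbors `k`, and the kernel choice are illustrative assumptions.

```python
# Hypothetical P-JITL-style local prediction: symmetric KL relevance
# selection over Gaussian latent posteriors, then a local GPR model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


def kl_diag_gaussian(m1, v1, m2, v2):
    """KL( N(m1, diag(v1)) || N(m2, diag(v2)) ) for diagonal covariances."""
    return 0.5 * np.sum(np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def symmetric_kl(m1, v1, m2, v2):
    """Symmetric KL divergence as the dissimilarity between two posteriors."""
    return kl_diag_gaussian(m1, v1, m2, v2) + kl_diag_gaussian(m2, v2, m1, v1)


def jitl_predict(query_mean, query_var, hist_means, hist_vars, hist_y, k=30):
    """Select the k most relevant historical samples for the query posterior
    and fit a local GPR model on their latent means (illustrative only)."""
    d = np.array([symmetric_kl(query_mean, query_var, m, v)
                  for m, v in zip(hist_means, hist_vars)])
    idx = np.argsort(d)[:k]                      # k smallest divergences
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                   normalize_y=True)
    gpr.fit(hist_means[idx], hist_y[idx])
    return gpr.predict(query_mean.reshape(1, -1))[0]
```

In this sketch, `hist_means` and `hist_vars` hold the posterior means and variances of the latent variables for the historical samples, and `hist_y` holds the corresponding outputs; a deterministic JITL would instead rank samples by a point-estimate distance, which is where the probabilistic treatment of noise and missing data differs.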
