Abstract

Hyper-parameter estimation is a central problem for Gaussian processes, and Markov Chain Monte Carlo (MCMC) is a simple and powerful method for it. In the internet era, new data arrive continuously, so the hyper-parameters of a Gaussian process must be updated in an online manner. The main obstacle to hyper-parameter estimation with MCMC, however, is computation time: the large covariance matrices of the Gaussian process make every MCMC iteration expensive. We propose the Less-Last Number Hyper-parameter (LLNH) algorithm to reduce the computation time of MCMC. The idea is to merge two MCMC sub-posteriors iteratively. The first sub-posterior supplies the most recent hyper-parameter estimate. The second sub-posterior runs MCMC on the <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">$m$</tex> most recent data points, including the newly arrived data, at each iteration (the <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">$m$</tex>-online sub-data, with <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">$m$</tex> ≪ data size), initialized at the recent estimate. Merging the two sub-posteriors yields the new hyper-parameter estimate. Technically, MCMC runs on only m data points per iteration, which keeps computation time low; moreover, starting the chain from the recent estimate lets it stand in for an MCMC run over the remaining data. The algorithm is an efficient hyper-parameter estimation method for Gaussian process regression on streaming real-time data and is applicable to the online setting. The results show that the LLNH algorithm performs well on hyper-parameter estimation and requires less computation time than offline MCMC.
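The online loop described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): a random-walk Metropolis chain over the log hyper-parameters of an RBF-kernel GP, run only on the last m points each time a new observation arrives, warm-started at the previous estimate. All function and parameter names here are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance, noise=1e-2):
    # Squared-exponential covariance matrix plus a small noise jitter.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2) + noise * np.eye(len(x))

def log_marginal(theta, x, y):
    # GP log marginal likelihood; theta = (log lengthscale, log signal variance).
    ls, var = np.exp(theta)
    L = np.linalg.cholesky(rbf_kernel(x, ls, var))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

def mh_chain(theta0, x, y, n_steps=200, step=0.1, rng=None):
    # Random-walk Metropolis on the GP marginal likelihood.
    if rng is None:
        rng = np.random.default_rng(0)
    theta = np.asarray(theta0, dtype=float)
    lp = log_marginal(theta, x, y)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_marginal(prop, x, y)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

def llnh_online(x_stream, y_stream, m=20, theta0=(0.0, 0.0)):
    # LLNH-style loop (sketch): each new point triggers a short chain on
    # only the last m points, warm-started at the previous estimate so the
    # chain stands in for MCMC over the data already seen.
    theta = np.asarray(theta0, dtype=float)
    xs, ys = [], []
    for xn, yn in zip(x_stream, y_stream):
        xs.append(xn)
        ys.append(yn)
        if len(xs) >= 3:
            xw, yw = np.array(xs[-m:]), np.array(ys[-m:])
            chain = mh_chain(theta, xw, yw)
            theta = chain[len(chain) // 2:].mean(axis=0)  # posterior mean after burn-in
    return theta
```

Because each chain sees at most m points, the per-iteration Cholesky cost is O(m^3) rather than O(n^3) for the full data set, which is the source of the speed-up the abstract claims.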
