Abstract

This research applies a hard-cut iterative training algorithm to improve the Gaussian process mixture (GPM) model. The resulting enhanced GPM (EGPM) estimates the distribution parameters as precisely as possible. GPM models are powerful tools for data representation and forecasting owing to their linear mixture of multiple Gaussian process (GP) models. Under the hard-cut algorithm, the posterior probabilities of the hidden variables in the GPM model are restricted to 0 or 1, which simplifies the training process and reduces the computational cost, since each GP component can then be trained independently via maximum likelihood estimation. The EGPM model is then applied to a short-term electric load forecasting problem and compared with various forecasting models. First, the EGPM results are compared with those of two previous GPM training algorithms: the variational and leave-one-out cross-validation (LOOCV) algorithms. The experimental results indicate that the EGPM model forecasts electric loads more accurately and reliably. The single GP, support vector machine, and radial basis function network models are also assessed on the same short-term electric load forecasting problem. The empirical results indicate that the proposed EGPM outperforms the other methods in terms of forecasting accuracy.
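To make the hard-cut idea concrete, the following is a minimal sketch of hard-cut iterative training for a GP mixture, assuming scikit-learn GPs as the component models. The kernel choice, k-means initialization, component count, and convergence test are illustrative assumptions rather than the paper's exact specification; in each iteration every GP is fit by maximizing its log marginal likelihood, and every sample's posterior is then forced to 0 or 1 by assigning it to the best-fitting component.

```python
# Hard-cut iterative training of a GP mixture (illustrative sketch).
import numpy as np
from scipy.stats import norm
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel


def hard_cut_gpm(X, y, n_components=2, n_iter=10, random_state=0):
    """Fit a GP mixture with hard (0/1) posterior assignments."""
    rng = np.random.default_rng(random_state)
    # Initial hard assignment of samples to components (here: k-means on inputs).
    labels = KMeans(n_clusters=n_components, n_init=10,
                    random_state=random_state).fit_predict(X)

    gps, log_priors = [], np.zeros(n_components)
    for _ in range(n_iter):
        # M-step: train each GP on its assigned samples; scikit-learn fits the
        # kernel hyperparameters by maximizing the log marginal likelihood.
        gps = []
        for k in range(n_components):
            idx = labels == k
            if idx.sum() < 2:  # guard against an empty component
                idx = np.isin(np.arange(len(y)),
                              rng.choice(len(y), size=5, replace=False))
            kernel = ConstantKernel() * RBF() + WhiteKernel()
            gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                                          random_state=random_state)
            gp.fit(X[idx], y[idx])
            gps.append(gp)
            log_priors[k] = np.log(idx.mean() + 1e-12)

        # Hard-cut E-step: the posterior of each sample is 0 or 1, i.e. each
        # sample goes to the component with the highest prior-weighted
        # predictive log-likelihood.
        log_lik = np.empty((len(y), n_components))
        for k, gp in enumerate(gps):
            mean, std = gp.predict(X, return_std=True)
            log_lik[:, k] = norm.logpdf(y, loc=mean, scale=std)
        new_labels = np.argmax(log_lik + log_priors, axis=1)

        if np.array_equal(new_labels, labels):  # assignments stable: stop
            break
        labels = new_labels
    return gps, log_priors, labels


if __name__ == "__main__":
    # Toy data with two regimes, standing in for distinct load patterns.
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(200, 1))
    y = np.where(X[:, 0] <= 5, np.sin(X[:, 0]), 2.0 + 0.3 * X[:, 0])
    y = y + 0.05 * rng.standard_normal(len(y))

    gps, log_priors, labels = hard_cut_gpm(X, y, n_components=2)
    print("samples per component:", np.bincount(labels))
```

Because each sample belongs to exactly one component, the M-step decomposes into independent GP fits, which is the source of the training simplification and the reduced computational cost described above.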
