Abstract

The global warming caused by greenhouse gas (GHG) emissions has aroused wide public concern. To give policy makers better support in setting specific GHG emission reduction targets, we propose an ensemble learning method that applies the least squares boosting (LSBoost) algorithm to the kernel-based nonlinear multivariate grey model KGM (1, N); the resulting model is abbreviated as BKGM (1, N). The KGM (1, N) can handle nonlinear small-sample time-series prediction; however, its prediction accuracy depends to an extent on the proper selection of the regularization parameter and the kernel parameter. In the boosting scheme, the KGM (1, N) is used as a base learner, and early stopping is applied to avoid overfitting the training dataset. An empirical analysis of forecasting GHG emissions in 27 European countries for the period 2015–2019 is carried out. Overall error analysis indicators demonstrate that the BKGM (1, N) provides remarkable prediction performance compared with the original KGM (1, N), support vector regression (SVR), and robust linear regression (RLR) in estimating GHG emissions.

Highlights

  • Excessive greenhouse gas (GHG) emissions have led to global warming and the frequent occurrence of extreme climate disasters

  • For the kernel-based nonlinear multivariate grey model, we propose an ensemble learning method with the least squares boosting (LSBoost) algorithm [31], an instantiation of GradientBoost. The main contributions of our study are summarized as follows: (1) We regard the multivariable grey forecasting model as one that can solve a multiple regression problem with time-series characteristics, which provides a way to improve the prediction accuracy of multivariable grey forecasting models through ensemble learning

  • Least Squares Boosting Algorithm. The LSBoost algorithm adjusts weight values through repeated training, with the aim of minimizing the error between the target variable and the aggregated prediction of the base learners, and obtains the final result from the iterated weights. The detailed procedure of LSBoost (Algorithm 1) is described as follows: Input: initial value F0(x), the maximum iteration number M, training dataset size N, learning rate ν
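The boosting loop described above can be sketched as follows. Since the paper's KGM (1, N) base learner is not reproduced here, a simple ridge regression stands in for it; all function names, the patience value, and the validation-based early stopping details are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def lsboost_fit(X, y, fit_base, M=50, nu=0.1, X_val=None, y_val=None, patience=5):
    """Least squares boosting: each round fits a base learner to the current
    residuals and adds a shrunken copy of it to the ensemble."""
    F0 = y.mean()                            # F0(x): constant initial model
    F = np.full(len(y), F0)
    learners, best_err, since_best = [], np.inf, 0
    F_val = np.full(len(y_val), F0) if X_val is not None else None
    for m in range(M):
        h = fit_base(X, y - F)               # fit residuals r = y - F_{m-1}(x)
        F = F + nu * h(X)                    # F_m = F_{m-1} + nu * h_m
        learners.append(h)
        if X_val is not None:                # early stopping on validation MSE
            F_val = F_val + nu * h(X_val)
            err = np.mean((y_val - F_val) ** 2)
            if err < best_err:
                best_err, since_best = err, 0
            else:
                since_best += 1
                if since_best >= patience:
                    break
    return F0, learners

def lsboost_predict(F0, learners, X, nu=0.1):
    """Aggregate the constant start value and all shrunken base learners."""
    return F0 + nu * sum(h(X) for h in learners)

def ridge_base(X, r, lam=1e-3):
    """Stand-in base learner (the paper uses KGM (1, N) here)."""
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ r)
    return lambda Z: Z @ w
```

The learning rate ν trades off computational cost against generalization: a smaller ν needs more boosting rounds but typically overfits less, which is the trade-off the later section on learning rate examines.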


Summary

Kernel-Based Nonlinear Multivariate Grey Model

The modelling procedures of the KGM (1, N) are given as follows [25]: S_1^(0) is the system characteristic sequence, and S_i^(0) (i = 2, ..., N) are the relevant factor sequences. The nonlinear part of the model is represented as f(k) = w^T φ([S_2^(1)(k), ..., S_N^(1)(k)]^T) + b, where b is a bias, w is a weight vector, and φ maps the sequences into a higher-dimensional feature space. A convex optimization problem is then constructed, in which the regularization parameter C is a positive number. To solve this convex optimization problem, we utilize the Lagrange multiplier method and the Gaussian kernel function K(x_u, x_v) = exp(−‖x_u − x_v‖^2 / (2σ^2)), where x_u and x_v are column vectors and σ is the kernel parameter
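The convex problem above follows the least squares support vector machine (LS-SVM) formulation, whose Lagrangian dual reduces to one linear system in the bias b and the multipliers α. A minimal sketch under that standard formulation (the function names and the `+ b` sign convention are assumptions for illustration, not the paper's exact notation):

```python
import numpy as np

def gaussian_kernel(Xu, Xv, sigma):
    """K(x_u, x_v) = exp(-||x_u - x_v||^2 / (2 * sigma^2))."""
    d2 = ((Xu[:, None, :] - Xv[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, C=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system for (b, alpha):
       [ 0   1^T      ] [ b     ]   [ 0 ]
       [ 1   K + I/C  ] [ alpha ] = [ y ]
    The regularization parameter C and kernel parameter sigma are the
    two hyperparameters the boosting scheme is meant to be robust to."""
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                 # bias b, multipliers alpha

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """f(x) = sum_i alpha_i * K(x, x_i) + b."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha + b
```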

Least Squares Boosting Kernel-Based Nonlinear Multivariate Grey Model
Application in Forecasting GHG Emissions
Numerical Simulation and Predicted Results
Effect of Learning Rate on the Computational Cost and the Generalization Ability
