Abstract

In this paper, the authors propose a technique that decreases the average forecast error of regression-based models. The main idea of the method is to use, instead of a single regression equation, a weighted sum of several regression equations that satisfy the Ordinary Least Squares prerequisites and have mutually independent residuals. It is shown that when all of the method's requirements are met, the Mean Squared Error can be reduced almost by half using just three equations. The technique also allows deriving equations that contain more predictors than the number of observations. In addition, the method proves to be more stable over time than any of the individual regressions used within it. It is further illustrated that the proposed method outperforms a single regression equation computed with the same independent variables and thus yields more accurate estimators of the regression coefficients. Empirical results are provided as well.
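To make the combination idea concrete, below is a minimal numerical sketch, not the authors' exact procedure: three OLS equations are fitted on disjoint predictor subsets of simulated data, their forecasts are averaged with equal weights, and the mean squared error of the combination is compared with that of each single equation. The simulated data-generating process, the disjoint grouping, the equal weights, and all variable names are illustrative assumptions; the sketch shows only the mechanics of combining forecasts, not the conditions under which the paper's error reduction holds.

```python
# Sketch: combine forecasts from several OLS equations (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, p = 60, 40, 6

# Simulated data: six relevant predictors and a linear outcome with noise.
X = rng.normal(size=(n_train + n_test, p))
beta = rng.normal(size=p)
y = X @ beta + rng.normal(scale=1.0, size=n_train + n_test)
X_tr, X_te, y_tr, y_te = X[:n_train], X[n_train:], y[:n_train], y[n_train:]

def ols_forecast(cols):
    """Fit OLS with an intercept on the chosen predictor columns and forecast the test set."""
    A_tr = np.column_stack([np.ones(n_train), X_tr[:, cols]])
    A_te = np.column_stack([np.ones(n_test), X_te[:, cols]])
    coef, *_ = np.linalg.lstsq(A_tr, y_tr, rcond=None)
    return A_te @ coef

# Three equations built on disjoint predictor groups, so their errors are driven by
# different omitted variables (a stand-in for the independent-residuals requirement).
groups = [[0, 1], [2, 3], [4, 5]]
forecasts = np.column_stack([ols_forecast(g) for g in groups])

# Equal weights are used purely for illustration; the paper derives its own weights.
combined = forecasts.mean(axis=1)

mse = lambda f: float(np.mean((y_te - f) ** 2))
for k, g in enumerate(groups):
    print(f"equation on columns {g}: MSE = {mse(forecasts[:, k]):.3f}")
print(f"equal-weight combination: MSE = {mse(combined):.3f}")
```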

Highlights

  • When modeling a social or economic process, a researcher almost always faces uncertainty about whether the created model will keep working in the future with the same effectiveness, as partly highlighted by Stock and Watson (2007), Alhamzawi and Yu (2012), Clark and McCracken (2009), Gneiting (2011), and Giordani, Kohn and van Dijk (2007)

  • That is why we devote this paper to the development of a method that can incorporate more explanatory variables and significantly decrease forecast error without violating the classical way of specifying a regression model

  • In this paper we try to make a small step towards improving existing methods of forecasting economic processes


Summary

Introduction

When modeling a social or economic process, a researcher almost always faces uncertainty about whether the created model will keep working in the future with the same effectiveness, as partly highlighted by Stock and Watson (2007), Alhamzawi and Yu (2012), Clark and McCracken (2009), Gneiting (2011), and Giordani, Kohn and van Dijk (2007). Either unaccounted factors change their values in such a way that the coefficient estimators become biased, or accounted factors change the extent of their impact on the output variable. It can also be a combination of both; see, for example, Orphanides and van Norden (2005) and Primiceri, G. To decrease these errors, a researcher can build a model that takes structural breaks and coefficient variability in the regression equation into account; refer, for example, to Groen, Paap and Ravazzolo (2009) and Sensier, M. and Van Dijk, D. That is why we devote this paper to the development of a method that can incorporate more explanatory variables and significantly decrease forecast error without violating the classical way of specifying a regression model.

Empirical Background of the Method
The Method
Application to Real Data
Findings
Conclusion

