Abstract

Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. According to the Gauss-Markov theorem, the estimator remains unbiased under multicollinearity, but the variance of its regression estimates becomes inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, though they possess a smaller mean squared error than the OLS estimator. Methods: In this study, we developed a new unbiased estimator based on the K-L estimator and compared its performance with some existing estimators theoretically, through simulation, and on real-life data. Results: Theoretically, the new estimator is not only unbiased but also possesses minimum variance when compared with the other estimators. Results from the simulation and the real-life study showed that the new estimator produced a smaller mean squared error (MSE) and the smallest mean squared prediction error (MSPE), further strengthening the findings of the theoretical comparison under both the MSE and MSPE criteria. Conclusions: A simulation study and a real-life application modelling the high heating values from proximate analysis were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation with and without multicollinearity in a linear regression model.
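For reference, the estimators named above have the following standard closed forms, written in the notation used later in the highlights; they are not spelled out in this abstract, and the K-L expression is the Kibria-Lukman (2020) form assumed here rather than quoted from this paper:

β̂_OLS = (X′X)⁻¹X′y
β̂_Ridge(k) = (X′X + kI)⁻¹X′y,  k > 0
β̂_KL(k) = (X′X + kI)⁻¹(X′X − kI)β̂_OLS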

Highlights

  • Consider the general linear regression model y = Xβ + ε (1), where ε is normally distributed with mean 0 and variance σ²I, I is the identity matrix, y is an n × 1 vector of the dependent variable, X is an n × p matrix of the independent variables, and β is a p × 1 vector of unknown regression parameters of interest

  • When the ordinary least squares (OLS) estimator is applied to a model in which the independent variables are correlated, the variance of the regression estimates becomes inflated [1,2] (see the sketch after this list)

  • The limitation of these estimators is that they are biased; unbiased versions of some of them have been developed. Their advantage is that they produce estimates similar to those of the OLS estimator but with a smaller mean squared error
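The variance-inflation point in the second highlight follows directly from Var(β̂_OLS) = σ²(X′X)⁻¹. Below is a minimal sketch, assuming an illustrative simulated two-predictor design with correlation ρ = 0.99; the sample size, ρ, and σ² = 1 are arbitrary choices, not values from the paper:

import numpy as np

rng = np.random.default_rng(0)
n, rho = 100, 0.99                 # illustrative sample size and correlation

# Two highly correlated predictors.
cov = np.array([[1.0, rho], [rho, 1.0]])
X = rng.multivariate_normal(np.zeros(2), cov, size=n)

# Var(beta_hat_OLS) = sigma^2 (X'X)^(-1); with sigma^2 = 1 the diagonal of
# (X'X)^(-1) gives the coefficient variances directly.
print("correlated design:  ", np.diag(np.linalg.inv(X.T @ X)))

# Same exercise with an uncorrelated design of the same size, for contrast.
X0 = rng.multivariate_normal(np.zeros(2), np.eye(2), size=n)
print("uncorrelated design:", np.diag(np.linalg.inv(X0.T @ X0)))

The correlated design's coefficient variances come out on the order of 1/(1 − ρ²) ≈ 50 times larger, which is the inflation the highlight describes.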



Introduction

In addressing the problem of multicollinearity, various biased estimators with mean squared error smaller than that of the OLS estimator have been developed by different authors [2,3,4,5,6,7,8,9,10,11,12,13,14,15]. The limitation of these estimators is that they are biased; unbiased versions of some of them have been developed.
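The biased estimators referenced here trade a small bias for a large reduction in variance. The following is a minimal simulation sketch of that MSE trade-off, assuming a correlated design, an arbitrary fixed shrinkage value k, and the Kibria-Lukman (2020) form of the K-L estimator; none of these choices are quoted from this paper:

import numpy as np

rng = np.random.default_rng(1)
n, p, k, reps = 50, 3, 0.5, 2000   # illustrative design size and shrinkage value
beta = np.ones(p)                  # true coefficients (assumed)

# Strongly correlated design matrix, held fixed across replications.
rho = 0.95
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
XtX, I = X.T @ X, np.eye(p)

mse = {"OLS": 0.0, "Ridge": 0.0, "KL": 0.0}
for _ in range(reps):
    y = X @ beta + rng.standard_normal(n)
    b_ols = np.linalg.solve(XtX, X.T @ y)
    b_ridge = np.linalg.solve(XtX + k * I, X.T @ y)
    # Assumed Kibria-Lukman (2020) form: (X'X + kI)^(-1) (X'X - kI) b_ols
    b_kl = np.linalg.solve(XtX + k * I, (XtX - k * I) @ b_ols)
    for name, b in (("OLS", b_ols), ("Ridge", b_ridge), ("KL", b_kl)):
        mse[name] += np.sum((b - beta) ** 2) / reps

print(mse)  # the shrinkage estimators typically show the smaller MSE here

Under a design like this, the biased estimators usually beat OLS in MSE despite their bias, which is the motivation for the unbiased variants the paper develops.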
