Abstract

The presence of multicollinearity leads to inconsistent parameter estimates in regression modeling. Ordinary Least Squares (OLS), the most common estimation procedure in regression analysis, is not robust to the multicollinearity problem and results in an inaccurate model. A number of methods have been developed in the literature to address this problem, the most common being ridge regression. Although many studies propose a variety of methods to overcome the multicollinearity problem in regression analysis, this study proposes a simple ridge regression model in which the value of k (the ridge estimator in the ridge regression model) is determined from a linear combination of the least-squares regression coefficients of the independent variables. The performance of the proposed method is investigated and compared with OLS and several recent existing methods through Monte Carlo simulation studies. The proposed method produces findings similar to those of existing methods and outperforms OLS in the presence of multicollinearity in the regression model.

Highlights

  • Ordinary Least Squares (OLS) is the Best Linear Unbiased Estimator (BLUE) for investigating the relationship between explanatory and response variables in regression modeling

  • A critical issue arises when the analysis involves large data sets with many explanatory variables, as this can introduce the multicollinearity problem

  • Given the interest in choosing the most appropriate k in the ridge regression method, this study proposes another technique in which k is formed as a linear combination of the coefficients of determination of the explanatory variables
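To make the last highlight concrete, the sketch below computes the coefficient of determination R²_j of each explanatory variable regressed on the remaining ones and combines them into a candidate k. The data, and the choice of equal weights (a simple average), are illustrative assumptions; the paper's exact combination is not specified here.

```python
import numpy as np

# Illustrative sketch only: form k as a linear combination of the R^2 values of
# the explanatory variables. Equal weights are an assumption for illustration.
rng = np.random.default_rng(1)
n = 100
z = rng.normal(size=n)
# Three deliberately correlated explanatory variables (shared component z).
X = np.column_stack([z + 0.1 * rng.normal(size=n) for _ in range(3)])

def r_squared(y, X):
    """R^2 from the least-squares regression of y on X (with intercept)."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) ** 2).sum()

# R^2_j: each explanatory variable regressed on the remaining ones.
r2 = [r_squared(X[:, j], np.delete(X, j, axis=1)) for j in range(X.shape[1])]
k = float(np.mean(r2))  # equal-weight linear combination (assumed)
print(r2, k)
```

With strongly collinear columns, each R²_j is close to 1, so this choice of k grows with the severity of the collinearity, which is the intuition behind R²-based ridge estimators.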

Introduction

Ordinary Least Squares (OLS) is the Best Linear Unbiased Estimator (BLUE) for investigating the relationship between explanatory and response variables in regression modeling. However, when the explanatory variables are highly correlated (multicollinearity), the parameter estimates and inference under the OLS procedure become insignificant and unreliable. Because of this problem, a number of estimation methods have been developed to overcome multicollinearity in regression analysis. [1, 2] were the first to introduce the ridge regression method, which adds a small positive quantity (denoted by k in many studies; see [3]) to the diagonal of the matrix XᵀX, where X is the matrix of explanatory variables, and it was shown that this biased estimator can achieve a smaller mean squared error (MSE) than OLS. It was found that most such models outperform OLS, with generalized ridge regression being the best model based on the smallest MSE value.
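The ridge mechanism described above can be sketched directly: the estimator solves (XᵀX + kI)β = Xᵀy, and k = 0 recovers OLS. The simulated data and the fixed k = 1.0 below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
z = rng.normal(size=n)
# Nearly collinear explanatory variables: each column is z plus tiny noise.
X = np.column_stack([z + 0.01 * rng.normal(size=n) for _ in range(p)])
beta_true = np.array([1.0, 2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

def ridge(X, y, k):
    """Ridge estimator: add k to the diagonal of X'X before solving."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

beta_ols = ridge(X, y, 0.0)  # k = 0 recovers the OLS estimator
beta_rr = ridge(X, y, 1.0)   # a small positive k (assumed) shrinks the estimates
print(np.linalg.norm(beta_ols), np.linalg.norm(beta_rr))
```

Under collinearity the OLS coefficients are wildly unstable, while any positive k shrinks the estimate toward zero, trading a small bias for a large reduction in variance, which is how ridge regression can attain a smaller MSE than OLS.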

