In multiple linear regression models, the explanatory variables are assumed to be uncorrelated with each other, but this assumption is violated in most applications. The ordinary least squares (OLS) estimator generally produces large variances when the explanatory variables are highly multicollinear. In this paper, we therefore propose new ridge parameters from a Bayesian perspective under different loss functions, using the Tierney and Kadane (T-K) approximation technique, to mitigate the effect of multicollinearity. We conduct a simulation study to compare the performance of the proposed estimators with the OLS estimator and with the ordinary ridge estimator using some of the best available ridge parameters, taking mean squared error as the evaluation criterion. A real application is also considered to demonstrate the superiority of the proposed estimators over their competitors. Based on the simulation and application results, we conclude that the Bayesian ridge parameter estimated under the general entropy loss function outperforms the OLS estimator and the ordinary ridge estimator when the number of explanatory variables is small; the same holds for a larger number of explanatory variables when the sample size is small. For larger sample sizes and larger numbers of explanatory variables, however, the ordinary ridge estimator with the best ridge parameter performs best.
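The core phenomenon the abstract describes can be illustrated with a minimal Monte Carlo sketch: under a highly collinear design, the ordinary ridge estimator typically achieves a smaller coefficient mean squared error than OLS. The fixed ridge parameter `k` below is an illustrative choice for exposition only, not one of the Bayesian T-K parameters proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 30, 4, 200
beta = np.ones(p)  # true coefficient vector

# Highly collinear design: every column is mostly one shared latent
# factor, so X'X is nearly singular and OLS variances blow up.
z = rng.normal(size=(n, 1))
X = 0.95 * z + 0.05 * rng.normal(size=(n, p))

k = 1.0  # illustrative fixed ridge parameter (assumption, not the paper's choice)
mse_ols = 0.0
mse_ridge = 0.0
for _ in range(reps):
    y = X @ beta + rng.normal(size=n)
    # OLS: solve (X'X) b = X'y
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    # Ordinary ridge: solve (X'X + kI) b = X'y
    b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps
```

Averaged over replications, `mse_ridge` comes out well below `mse_ols` in this setting, which is the motivation for searching for good ridge parameters in the first place.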