Abstract
In multiple linear regression models, the explanatory variables should be uncorrelated with each other, but this assumption is violated in most cases. When the explanatory variables are highly multicollinear, the ordinary least squares (OLS) estimator generally produces large variances. In this paper, we therefore propose some new ridge parameters from a Bayesian perspective under different loss functions, using the Tierney and Kadane (T-K) approximation technique, to overcome the effect of multicollinearity. We conduct a simulation study to compare the performance of the proposed estimators with the OLS estimator and the ordinary ridge estimator equipped with some of the best available ridge parameters, using mean squared error as the performance evaluation criterion. A real application is also considered to show the superiority of the proposed estimators over the competing estimators. Based on the results of the simulation and the real application, we conclude that the Bayesian ridge parameter estimated under the general entropy loss function performs better than the OLS estimator and the ordinary ridge estimator when the number of explanatory variables is small; this also holds when the number of explanatory variables is large and the sample size is small. For large sample sizes and many explanatory variables, however, the ordinary ridge estimator with the best ridge parameter performs better than the others.
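The contrast described above between OLS and the ordinary ridge estimator can be sketched numerically. The snippet below is a minimal illustration only: it simulates a multicollinear design and applies a fixed, arbitrarily chosen ridge parameter k, not the Bayesian ridge parameters or T-K approximation proposed in the paper.

```python
# Illustrative sketch: OLS vs. ordinary ridge estimation on a
# simulated multicollinear design (not the paper's Bayesian method).
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 4
true_beta = np.ones(p)

# Build highly multicollinear explanatory variables: all columns
# share a common factor z plus small independent noise.
z = rng.normal(size=(n, 1))
X = z + 0.05 * rng.normal(size=(n, p))
y = X @ true_beta + rng.normal(size=n)

XtX = X.T @ X

# OLS estimator: beta_hat = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(XtX, X.T @ y)

# Ordinary ridge estimator: beta_hat(k) = (X'X + kI)^{-1} X'y,
# with k fixed here purely for illustration.
k = 1.0
beta_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)

# Compare estimation error of the two estimators.
mse_ols = np.sum((beta_ols - true_beta) ** 2)
mse_ridge = np.sum((beta_ridge - true_beta) ** 2)
print("OLS squared error:  ", mse_ols)
print("Ridge squared error:", mse_ridge)
```

Under severe multicollinearity, X'X is nearly singular, so the OLS solution is highly unstable, while adding kI stabilizes the inversion at the cost of some bias; the paper's contribution is in how k is estimated under Bayesian loss functions.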
Published in: Communications in Statistics - Simulation and Computation