Abstract

When the Ordinary Least Squares (OLS) method is used to estimate regression coefficients, the results become unreliable if two or more predictor variables are linearly related. The increased variance of the OLS estimator widens the confidence intervals of the estimates and may cause test procedures to yield misleading conclusions. In addition, it is difficult to determine the marginal contribution of the correlated predictors, since each estimate depends on the other predictor variables included in the model. Ridge Regression (RR) is a popular alternative in this scenario, but it invalidates the standard approach to statistical testing. The Raise Method (RM) was developed to combat multicollinearity while preserving statistical inference. In this work, we propose a novel procedure for determining the raise parameter, because the traditional choice is a function of the true coefficients, which limits the use of the Raise Method in real-world settings. Using simulations, the proposed method was compared with OLS and Ridge Regression in terms of predictive ability, coefficient stability, and the probability of obtaining inadmissible coefficients at different levels of sample size, linear dependence, and residual variance. The results show that the proposed method performs very well. Finally, a practical application is discussed.
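The core idea of the Raise Method is to replace a collinear predictor with a "raised" version obtained by adding a multiple of its residual from a regression on the other predictors, which lowers the correlation among predictors while leaving the column space useful for inference. The sketch below illustrates this idea on simulated data; the raise parameter `lam` is a hypothetical fixed value chosen for illustration only (the paper's contribution is precisely a data-driven rule for choosing it, which is not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)  # nearly collinear with x1
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

def ols(X, y):
    # OLS coefficients via least squares
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Residuals of x2 regressed on x1 (with intercept); e is orthogonal to x1
X1 = np.column_stack([np.ones(n), x1])
e = x2 - X1 @ ols(X1, x2)

lam = 5.0                    # raise parameter (illustrative value, not the paper's rule)
x2_raised = x2 + lam * e     # the "raised" variable

# Collinearity drops: cov(x1, x2_raised) = cov(x1, x2), but var(x2_raised) grows
corr_before = np.corrcoef(x1, x2)[0, 1]
corr_after = np.corrcoef(x1, x2_raised)[0, 1]
print(f"corr(x1, x2) before raise: {corr_before:.3f}, after: {corr_after:.3f}")

# OLS can now be run on the raised design matrix as usual
X_raised = np.column_stack([np.ones(n), x1, x2_raised])
beta_raised = ols(X_raised, y)
print("coefficients on raised design:", np.round(beta_raised, 3))
```

Because the residual `e` is orthogonal to `x1`, raising leaves the covariance between the predictors unchanged while inflating the variance of the raised variable, so the correlation shrinks in magnitude as `lam` grows.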
