Abstract

Regression techniques are generally used to predict a response variable using one or more predictor variables. In many fields of study, the regressors can be highly intercorrelated, which leads to the problem of multicollinearity. Consequently, the ordinary least squares (OLS) estimates become highly unstable and lead to wrong inferences. To handle this problem, machine learning techniques, particularly the ridge regression approach, are commonly used. In this paper, we revisit the problem of estimating the ridge parameter $k$ by proposing some new estimators using the Jackknife method and comparing them with some existing estimators. The performance of the proposed estimators relative to the existing ones is evaluated using extensive Monte Carlo simulations as well as two real data sets. The results suggest that the proposed estimators outperform the existing ones.
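
A minimal Monte Carlo sketch (in Python/NumPy, not the authors' code) of the kind of comparison summarized above: the regressors are generated with a common pairwise correlation, and the mean squared error of the OLS estimator is compared with that of a ridge estimator for a fixed, illustrative ridge parameter $k$. The design choices (n, p, rho, k = 0.5) are assumptions made only for illustration.

```python
# Sketch: MSE of OLS vs. ridge under strong multicollinearity.
# The data-generating design and the fixed k are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, rho, sigma, k, reps = 50, 4, 0.99, 1.0, 0.5, 2000
beta = np.ones(p)                                   # true coefficients

mse_ols = mse_ridge = 0.0
for _ in range(reps):
    # regressors with pairwise correlation approximately rho
    Z = rng.standard_normal((n, p + 1))
    X = np.sqrt(1 - rho) * Z[:, :p] + np.sqrt(rho) * Z[:, [p]]
    y = X @ beta + sigma * rng.standard_normal(n)

    XtX = X.T @ X
    b_ols = np.linalg.solve(XtX, X.T @ y)                     # OLS estimate
    b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)   # ridge estimate

    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps

print(f"estimated MSE(OLS)   = {mse_ols:.3f}")
print(f"estimated MSE(ridge) = {mse_ridge:.3f}")  # smaller under strong collinearity
```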

Highlights

  • The "primary goal of the regression analysis is to predict the response variable with the help of one or more predictor variables

  • The existing as well as the proposed ridge regression estimators have a smaller mean squared error (MSE) than the ordinary least squares (OLS) estimator

  • Ridge regression is a well-known technique used in the presence of multicollinearity in the data


Summary

INTRODUCTION

The "primary goal of the regression analysis is to predict the response variable with the help of one or more predictor variables. The OLS estimators have very large standard errors and lead to wrong inferences. To cope with this problem, machine learning techniques are widely used. If the matrix X X is ill-conditioned, it indicates that there exists a multicollinearity problem. In such cases, the OLS estimators are inconsistent and have large variances. Ridge regression was first proposed by [1] to solve the multicollinearity problem. As the ridge estimator is heavily dependent on the unknown value of k, "the optimum value for k that can produce the best results to some extent is still an open problem in the literature.

SOME EXISTING AND NEW ESTIMATORS
HOERL AND KENNARD ESTIMATOR
SHUKUR ESTIMATOR
ALKHAMISI ESTIMATOR
PROPOSED ESTIMATORS
JACKKNIFE ALGORITHM
SIMULATION RESULTS AND DISCUSSION
REAL DATA APPLICATION
CASE STUDY 2
CONCLUSION