Abstract

Regression models explore the relationship between a response variable and one or more explanatory variables. In real-life applications, modeling this relationship becomes practically challenging when the explanatory variables are linearly dependent. To mitigate this issue, several shrinkage estimators have been proposed; the ridge estimator is one of the most popular when there is linear dependency among the explanatory variables. In this study, we propose a jackknifed version of the ridge estimator for the Bell regression model, jackknifing the ridge estimator to reduce its bias. The simulation and application results reveal that the proposed estimator enhances the performance of the ridge estimator.
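The abstract does not give the estimator's form for the Bell model, but the jackknifing idea can be illustrated in the standard linear setting. The sketch below (an assumption for illustration, not the paper's method) uses the classical jackknifed ridge estimator, which corrects the ridge estimator's leading-order bias by applying the factor (I + kA⁻¹), where A = X'X + kI and k is the ridge parameter; the function names and the simulated collinear design are hypothetical.

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimator: solves (X'X + kI) beta = X'y."""
    p = X.shape[1]
    A = X.T @ X + k * np.eye(p)
    return np.linalg.solve(A, X.T @ y)

def jackknifed_ridge(X, y, k):
    """Jackknifed ridge estimator: (I + k A^{-1}) beta_ridge, A = X'X + kI.

    The ridge bias is -k A^{-1} beta (order k); the jackknifed version
    reduces it to -k^2 A^{-2} beta (order k^2)."""
    p = X.shape[1]
    A = X.T @ X + k * np.eye(p)
    beta_r = np.linalg.solve(A, X.T @ y)
    return beta_r + k * np.linalg.solve(A, beta_r)

# Hypothetical demo: a nearly collinear design, where ridge is useful.
rng = np.random.default_rng(0)
n, p = 200, 3
z = rng.normal(size=n)
X = np.column_stack([z + 0.01 * rng.normal(size=n) for _ in range(p)])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(size=n)

beta_r = ridge(X, y, k=1.0)
beta_jr = jackknifed_ridge(X, y, k=1.0)
```

Because every eigenvalue of (I + kA⁻¹) is at least 1, the jackknifed estimate shrinks less toward zero than the plain ridge estimate, which is exactly the bias-reduction effect the abstract describes.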
