Abstract

The linear regression model explores the relationship between a response variable and one or more independent variables. The ordinary least squares estimator is usually adopted to estimate the parameters of the model when the independent variables are uncorrelated. However, the estimator's performance deteriorates when the independent variables are correlated, a situation known as multicollinearity. This paper develops a new biased regression estimator, based on one-parameter and two-parameter estimators, as an alternative to the ordinary least squares estimator when the independent variables are linearly dependent. Theoretical comparisons, a simulation study, and a real-life data application were carried out. The results reveal that the new estimator dominates the other estimators considered in this study.
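The instability that motivates biased estimation can be illustrated with a minimal sketch. The paper's new estimator is not specified in the abstract, so the example below uses ordinary least squares alongside the classic one-parameter (ridge) estimator as a stand-in; the shrinkage parameter `k` and the simulated data are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Two nearly collinear predictors: x2 is x1 plus tiny noise,
# so X'X is close to singular (severe multicollinearity).
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
X = np.column_stack([x1, x2])
y = 2 * x1 + 3 * x2 + rng.normal(size=n)

# OLS: beta = (X'X)^{-1} X'y -- variance explodes along the
# near-null direction of X'X when predictors are collinear.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# One-parameter (ridge-type) biased estimator:
# beta(k) = (X'X + kI)^{-1} X'y, trading a little bias for
# a large reduction in variance. k = 1.0 is an arbitrary choice.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("OLS coefficients:  ", beta_ols)
print("Ridge coefficients:", beta_ridge)
```

The individual OLS coefficients are essentially arbitrary here (only their sum is well determined by the data), whereas the biased estimator returns stable values near the identifiable total effect.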
