Abstract

Let β_1, …, β_p be the slope parameters in a linear regression model and consider the goal of testing H_0: β_j = 0 (j = 1, …, p). A well-known concern is that multicollinearity can inflate the standard error of the least squares estimate of β_j, which in turn can result in relatively low power. The paper examines heteroscedastic methods for dealing with this issue via a ridge regression estimator. A method is found that might substantially increase the probability of identifying a single slope that differs from zero. But due to the bias of the ridge estimator, it cannot reject H_0 for more than one value of j. Simulations indicate that the increase in power is a function of the correlations among the independent variables as well as the nature of the distributions generating the data.
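As a minimal sketch of the setting the abstract describes: the ridge estimator β̂(k) = (X'X + kI)^{-1} X'y shrinks the least squares solution, trading bias for a reduction in variance when the predictors are highly correlated. The simulation below is illustrative only (the design, sample size, and tuning constant k are arbitrary choices, not the paper's); it compares ordinary least squares with a ridge fit on two nearly collinear predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two highly correlated predictors
# (the multicollinearity scenario described in the abstract).
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)   # corr(x1, x2) is close to 1
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 0.0 * x2 + rng.normal(size=n)

# Ordinary least squares: solve (X'X) beta = X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge estimator: solve (X'X + k I) beta = X'y, with k > 0.
# k = 1.0 is an arbitrary illustrative choice of the tuning constant.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```

With k > 0 the ridge coefficients have smaller norm than the OLS coefficients, which is the variance-reduction effect the paper exploits to raise power; the accompanying bias is what limits the method to rejecting H_0 for at most one slope.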
