Abstract

Ridge regression deals with collinearity in the homoscedastic linear regression model. When the number of predictors (p) is much larger than the number of observations (n), it yields unique least-squares estimators. From both the classical and Bayesian approaches, parameter estimation is a computationally demanding task: in the first it is an optimization problem, and in the second a high-dimensional integration problem usually addressed through Markov chain Monte Carlo (MCMC). The main drawback of MCMC is the practical impossibility of verifying convergence to the posterior distribution, which is commonly very slow due to the large number of regression parameters. Here, a computational algorithm is proposed to obtain posterior estimates of the regression parameters, variance components, and predictions for the conventional ridge regression model. The algorithm is based on a reparametrization of the model that allows the marginal posterior means and variances to be obtained by integrating out a nuisance parameter whose marginal posterior is defined on an open interval.
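As a minimal illustration of the uniqueness claim above, the sketch below computes the classical ridge estimator in closed form for a p >> n problem. The notation (design matrix X, response y, penalty lam) and the penalty value are illustrative assumptions, not the paper's algorithm; note that the penalized Gram matrix X'X + lam*I is invertible even though X'X alone is rank-deficient when p > n.

```python
import numpy as np

# Hypothetical p >> n setup: 20 observations, 100 predictors.
rng = np.random.default_rng(0)
n, p = 20, 100
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0  # a few truly nonzero coefficients
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Ridge estimator: beta_hat = (X'X + lam*I)^{-1} X'y.
# The penalty lam*I makes the system nonsingular, so the
# solution is unique even though p > n.
lam = 1.0  # illustrative penalty choice
A = X.T @ X + lam * np.eye(p)
beta_hat = np.linalg.solve(A, X.T @ y)
```

Without the penalty (lam = 0), `A` would be singular here and ordinary least squares would have infinitely many solutions.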
