Abstract

In this paper, we propose new Bayesian hierarchical representations of the lasso, adaptive lasso, and elastic net quantile regression models. We develop these representations from the observation that the lasso penalty function corresponds to a scale mixture of truncated normal distributions (with exponential mixing densities). We consider fully Bayesian treatments that lead to new Gibbs sampling methods with tractable full conditional posteriors. The new methods are then illustrated with both simulated and real data. Results show that the new methods perform well under a variety of simulation settings, including a moderately large number of predictors, collinearity, and heterogeneity.
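As a minimal illustrative sketch (not the paper's sampler): penalized quantile regression minimizes a check-loss plus an L1 penalty, and the posterior mode under an asymmetric-Laplace likelihood with Laplace (lasso) priors on the coefficients coincides with that minimizer. The function names `check_loss` and `lasso_qr_objective` and the penalty weight `lam` are illustrative, not from the paper.

```python
import numpy as np

def check_loss(u, tau):
    # Quantile check loss: rho_tau(u) = u * (tau - I(u < 0)).
    return u * (tau - (u < 0))

def lasso_qr_objective(beta, X, y, tau, lam):
    # Penalized check-loss for quantile level tau; its minimizer equals the
    # posterior mode under an asymmetric-Laplace likelihood with Laplace
    # (lasso) priors on beta. (Illustrative objective, not the Gibbs sampler.)
    resid = y - X @ beta
    return np.sum(check_loss(resid, tau)) + lam * np.sum(np.abs(beta))

# Toy usage: evaluate the objective on simulated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, 0.0, -0.5])
y = X @ beta_true + rng.normal(size=50)
val = lasso_qr_objective(beta_true, X, y, tau=0.5, lam=1.0)
```

The paper's contribution is to sample this posterior exactly via Gibbs updates with tractable full conditionals, rather than optimizing the objective above.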
