Abstract

We consider three Bayesian penalized regression models and show that the respective deterministic scan Gibbs samplers are geometrically ergodic regardless of the dimension of the regression problem. We prove geometric ergodicity of the Gibbs samplers for the Bayesian fused lasso, the Bayesian group lasso, and the Bayesian sparse group lasso. Geometric ergodicity, together with a moment condition, guarantees a Markov chain central limit theorem for Monte Carlo averages and thus ensures reliable output analysis. Our geometric ergodicity results also allow us to provide default starting values for the Gibbs samplers.

Highlights

  • Let y ∈ Rⁿ be the observed realization of the response Y, X be the n × p model matrix, and β ∈ Rᵖ be the regression coefficient vector

  • We show that the Markov chain Monte Carlo (MCMC) samplers used in the three models, all deterministic scan Gibbs samplers, converge to their respective stationary distributions at a geometric rate under reasonable conditions

Summary

Introduction

Let y ∈ Rⁿ be the observed realization of the response Y, X be the n × p model matrix, and β ∈ Rᵖ be the regression coefficient vector. We show that the MCMC samplers used in the three models, namely deterministic scan Gibbs samplers, converge to their respective stationary distributions at a geometric rate under reasonable conditions. We only require the number of observations, n, to be larger than three, and we place no assumptions on the number of covariates, p, or on the model matrix X. This geometric rate of convergence allows for reliable estimation of posterior quantities in the following way. If the deterministic scan Gibbs sampler is geometrically ergodic and, for some δ > 0, ∫ |g(β, η, σ²)|^(2+δ) f(β, η, σ² | y) dβ dη dσ² < ∞, then a Markov chain CLT holds for the Monte Carlo average ḡ_N of g over the first N scans of the chain; that is, √N (ḡ_N − E_f g) converges in distribution to N(0, κ²) for some finite asymptotic variance κ². Johnson and Jones (2015) established geometric ergodicity of a four-variable random scan Gibbs sampler for a hierarchical random effects model. Our analysis will also lead us to default starting values for the three Gibbs samplers.
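To make the role of the CLT concrete, the following is a minimal sketch, not taken from the paper, of CLT-based output analysis: it treats the sampler output for a function g(β, η, σ²) as a generic sequence of correlated draws and computes a batch-means Monte Carlo standard error, whose use is justified when the chain is geometrically ergodic and the moment condition above holds. The AR(1) stand-in series and all function names are illustrative placeholders, not the authors' code.

```python
# Minimal sketch (illustrative, not from the paper): batch-means output
# analysis for Gibbs sampler draws, justified by the Markov chain CLT.
import numpy as np

def batch_means(g_draws, n_batches=30):
    """Return the Monte Carlo average of g_draws and a batch-means
    standard error; valid when a Markov chain CLT holds for g."""
    g_draws = np.asarray(g_draws, dtype=float)
    batch_size = len(g_draws) // n_batches
    g_draws = g_draws[: n_batches * batch_size]          # drop the remainder
    batch_avgs = g_draws.reshape(n_batches, batch_size).mean(axis=1)
    overall = batch_avgs.mean()
    # Batch-means estimate of the asymptotic variance kappa^2 in the CLT
    kappa2_hat = batch_size * np.sum((batch_avgs - overall) ** 2) / (n_batches - 1)
    return overall, np.sqrt(kappa2_hat / len(g_draws))

# Stand-in for g(beta_t, eta_t, sigma2_t) over N scans: an AR(1) series
# mimicking autocorrelated Gibbs output (real sampler draws would replace this).
rng = np.random.default_rng(1)
N, rho = 20_000, 0.9
g_draws = np.empty(N)
g_draws[0] = 0.0
for t in range(1, N):
    g_draws[t] = rho * g_draws[t - 1] + rng.normal()

est, se = batch_means(g_draws)
print(f"estimate of E[g] = {est:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

In practice one would apply batch_means to each posterior quantity of interest and report the Monte Carlo average alongside its CLT-based standard error.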

Bayesian Fused Lasso
Gibbs Sampler for the Bayesian Fused Lasso
Bayesian Group Lasso
Gibbs Sampler for Bayesian Group Lasso
Bayesian Sparse Group Lasso
Gibbs Sampler for Bayesian Sparse Group Lasso
Discussion
A Preliminaries
Useful Lemmas
Propriety of the Prior
Validity of the Prior
Drift Condition
Minorization
Starting Values
D Proof of Geometric Ergodicity in the Bayesian Group Lasso
Minorization Condition
E Proof of Geometric Ergodicity in the Bayesian Sparse Group Lasso