Abstract

The most common regression models for analyzing binary random variables are the logistic and probit regression models. However, it is well known that the estimates of the regression coefficients for these models are not robust to outliers [26]. The robit regression model [1, 16] is a robust alternative to the probit and logistic models; it is obtained by replacing the normal (logistic) distribution underlying the probit (logistic) regression model with the Student's $t$-distribution. We consider a Bayesian analysis of binary data with the robit link function. We construct a data augmentation (DA) algorithm that can be used to explore the corresponding posterior distribution. Following [10], we further improve the DA algorithm by adding a simple extra step to each iteration. Although the two algorithms are essentially equivalent in terms of computational complexity, the second algorithm is theoretically more efficient than the DA algorithm. Moreover, we analyze the convergence rates of these Markov chain Monte Carlo (MCMC) algorithms. We prove that, under certain conditions, both algorithms converge at a geometric rate. Geometric convergence has important theoretical and practical ramifications: it guarantees that the ergodic averages used to approximate posterior expectations satisfy central limit theorems, which in turn allows for the construction of asymptotically valid standard errors. These standard errors can be used to choose an appropriate (Markov chain) Monte Carlo sample size, and they allow one to use the MCMC algorithms developed in this paper with the same level of confidence one would have using classical (iid) Monte Carlo. The results are illustrated using a simple numerical example.
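To make the DA construction concrete, the sketch below implements one standard data augmentation Gibbs sampler for robit regression, based on the usual latent-variable representation: $z_i \sim t_\nu(x_i'\beta, 1)$ with $y_i = \mathbb{1}(z_i > 0)$, where the $t$-distribution is itself written as a scale mixture of normals, $z_i \mid \lambda_i \sim N(x_i'\beta, 1/\lambda_i)$, $\lambda_i \sim \mathrm{Gamma}(\nu/2, \nu/2)$. This is a minimal illustration under assumptions not taken from the paper: a flat prior on $\beta$, illustrative defaults ($\nu = 4$, the function name robit_da_sampler), and the extra step of [10] that yields the improved algorithm is omitted.

```python
import numpy as np
from scipy.stats import t as student_t

def robit_da_sampler(X, y, nu=4.0, n_iter=5000, rng=None):
    """Plain DA Gibbs sampler for Bayesian robit regression (flat prior on beta).

    Latent representation: z_i | lam_i ~ N(x_i' beta, 1/lam_i),
    lam_i ~ Gamma(nu/2, nu/2), and y_i = 1 iff z_i > 0.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for it in range(n_iter):
        m = X @ beta
        # Step 1a: z_i | beta, y_i is t_nu(m_i, 1) truncated to (0, inf) when
        # y_i = 1 and to (-inf, 0] when y_i = 0; sample via the inverse CDF.
        F0 = student_t.cdf(-m, df=nu)  # P(z_i <= 0)
        u = rng.uniform(np.where(y == 1, F0, 0.0),
                        np.where(y == 1, 1.0, F0))
        z = m + student_t.ppf(u, df=nu)
        # Step 1b: lam_i | z_i, beta ~ Gamma((nu+1)/2, rate=(nu+(z_i-m_i)^2)/2).
        lam = rng.gamma(shape=(nu + 1) / 2, scale=2.0 / (nu + (z - m) ** 2))
        # Step 2: beta | z, lam ~ N(beta_hat, (X' Lam X)^{-1}), where
        # beta_hat = (X' Lam X)^{-1} X' Lam z and Lam = diag(lam).
        XtL = X.T * lam
        prec = XtL @ X
        beta_hat = np.linalg.solve(prec, XtL @ z)
        # Draw from N(beta_hat, prec^{-1}) using a Cholesky factor of prec.
        beta = beta_hat + np.linalg.solve(np.linalg.cholesky(prec).T,
                                          rng.standard_normal(p))
        draws[it] = beta
    return draws
```

In practice one would discard an initial burn-in and, as the abstract notes, use the CLT-based standard errors to decide how long to run the chain; the improved algorithm of the paper would add its extra step between Steps 1 and 2 of each iteration.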
