Abstract

Multivariate Bayesian linear regression (MBLR) is a popular statistical tool with applications across many scientific fields. A shortcoming, however, is potential model over-complexity: the model assumes that all responses depend on the same covariates and that all errors are pairwise correlated. The class of Bayesian seemingly unrelated regression (SUR) models generalizes the class of MBLR models by allowing for response-specific covariate sets. A recent work proposed employing Gaussian graphical models for learning sparse SUR (SSUR) models with conditional independencies among the errors. That SSUR model infers undirected edges among the errors, and its Reversible Jump Markov Chain Monte Carlo (RJMCMC) inference algorithm relies on approximations of the marginal likelihoods. In this paper, we propose a new, refined SSUR model that replaces the undirected graphs (Gaussian graphical models) with directed acyclic graphs (Gaussian Bayesian networks). Unlike the earlier model, our new model is therefore able to learn some directed edges among the errors. We also derive an RJMCMC algorithm that does not require approximations of the marginal likelihoods. In particular, we present an algorithm for sampling covariance matrices that are coherent with a given directed acyclic graph. The proposed RJMCMC algorithm allows for exact Bayesian model averaging across both the response-specific covariate sets and the directed acyclic graphs.
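To make the covariance-sampling step concrete, below is a minimal sketch (not the authors' algorithm) of how a covariance matrix coherent with a given directed acyclic graph can be generated, using the standard Gaussian Bayesian network parameterization Sigma = (I - B)^{-1} D (I - B)^{-T}, where B collects the edge weights of the DAG and D the node-wise conditional variances. The normal and inverse-gamma draws for the edge weights and conditional variances, and the function name sample_dag_coherent_covariance, are illustrative assumptions rather than details taken from the paper.

import numpy as np

def sample_dag_coherent_covariance(parents, rng, beta_scale=1.0, ig_shape=3.0, ig_rate=1.0):
    # parents[j] lists the parent indices of node j; the ordering 0..m-1 is assumed
    # to be topological, so that parents always precede their children.
    # Node-wise model: eps_j = sum_{k in pa(j)} b_{jk} eps_k + z_j,  z_j ~ N(0, d_j),
    # which implies Sigma = (I - B)^{-1} D (I - B)^{-T}.
    m = len(parents)
    B = np.zeros((m, m))
    D = np.zeros(m)
    for j in range(m):
        # conditional variance d_j from an inverse-gamma draw (illustrative prior choice)
        D[j] = 1.0 / rng.gamma(shape=ig_shape, scale=1.0 / ig_rate)
        # edge weight b_{jk} from a normal draw for each parent k of node j (illustrative)
        for k in parents[j]:
            B[j, k] = rng.normal(scale=beta_scale)
    I_minus_B_inv = np.linalg.inv(np.eye(m) - B)
    return I_minus_B_inv @ np.diag(D) @ I_minus_B_inv.T

rng = np.random.default_rng(0)
# DAG on three error terms: 0 -> 1, 0 -> 2, 1 -> 2 (already topologically ordered)
Sigma = sample_dag_coherent_covariance([[], [0], [0, 1]], rng)
print(Sigma)

Because B is strictly lower triangular under a topological ordering and all d_j are positive, the resulting Sigma is always symmetric positive definite and its conditional independence structure matches the given DAG.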
