ABSTRACT

Precision matrices are central to many fields, such as social networks, neuroscience, and economics, as they encode the edge structure of Gaussian graphical models (GGMs): a zero in an off-diagonal position of the precision matrix indicates conditional independence between the corresponding nodes. In high-dimensional settings, where the dimension of the precision matrix exceeds the sample size and the matrix is sparse, methods such as the graphical Lasso, graphical SCAD, and CLIME are popular for estimating GGMs. While frequentist methods are well studied, Bayesian approaches for (unstructured) sparse precision matrices are less explored. The graphical horseshoe estimate, which applies the global-local horseshoe prior, shows superior empirical performance, but theoretical work on sparse precision matrix estimation with shrinkage priors is limited. This paper addresses these gaps by providing concentration results for the tempered posterior under the fully specified horseshoe prior in high-dimensional settings. Moreover, we provide novel theoretical results under model misspecification, offering a general oracle inequality for the posterior. A concise set of simulations validates our theoretical findings.
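To make the setting concrete, the following is a minimal sketch of the sparse-precision-matrix estimation problem described above. It is not the paper's graphical horseshoe procedure: it simulates Gaussian data from a small sparse precision matrix (whose zero off-diagonal entries encode conditional independence in the GGM) and recovers the sparsity pattern with scikit-learn's graphical Lasso. The dimension p, sample size n, regularization level alpha, and the thresholding used to read off the support are arbitrary illustrative choices.

```python
# Illustrative sketch only: sparse GGM simulation + graphical Lasso recovery.
# The paper's graphical horseshoe posterior is not implemented here.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p, n = 10, 200  # dimension and sample size (illustrative choices)

# Sparse tridiagonal precision matrix: zeros beyond the first off-diagonal
# encode conditional independence between the corresponding nodes.
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)  # implied covariance of the GGM

# Draw n samples from the zero-mean Gaussian graphical model.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Penalized (frequentist) estimate of the precision matrix.
est = GraphicalLasso(alpha=0.05).fit(X)
Omega_hat = est.precision_

# Compare recovered and true sparsity patterns (small entries thresholded to zero).
support_hat = np.abs(Omega_hat) > 0.05
support_true = np.abs(Omega) > 0
print("fraction of correctly recovered entries:", np.mean(support_hat == support_true))
```

A Bayesian alternative, as studied in the paper, would replace the Lasso penalty with a global-local horseshoe prior on the off-diagonal entries and summarize the (tempered) posterior instead of a single penalized point estimate.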