Abstract
We introduce a novel procedure for obtaining cross-validated predictive estimates for Bayesian hierarchical regression models (BHRMs). BHRMs are popular for modeling complex dependence structures (e.g., Gaussian processes and Gaussian Markov random fields) but can be computationally expensive to fit. Cross-validation (CV) is, therefore, not a common practice for evaluating the predictive performance of BHRMs. Our method circumvents the need to rerun computationally costly estimation methods for each cross-validation fold and makes CV more feasible for large BHRMs. By conditioning on the variance-covariance parameters, we shift the CV problem from probability-based sampling to a familiar and straightforward optimization problem. Our approximation applies to leave-one-out CV and leave-one-cluster-out CV, the latter of which is more appropriate for models with complex dependencies. In many cases, this produces estimates equivalent to full CV. We provide theoretical results, demonstrate the efficacy of our method on publicly available data and in simulations, and compare its performance with several competing methods for CV approximation. Code and other supplementary materials are available online.
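The core idea of conditioning on variance-covariance parameters can be illustrated with a minimal sketch, not the paper's actual procedure: in a Gaussian random-intercept model, once the variance components are fixed (e.g., at their posterior means), each leave-one-cluster-out fold reduces to a generalized least squares solve rather than a new round of posterior sampling. All model dimensions, variance values, and the `gls_fit` helper below are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a random-intercept model: y_ij = x_ij' beta + u_j + eps_ij
n_clusters, n_per, p = 10, 20, 3
beta_true = np.array([1.0, -0.5, 2.0])
tau2, sigma2 = 0.5, 1.0  # variance components, held fixed (e.g., posterior means)

X = rng.normal(size=(n_clusters, n_per, p))
u = rng.normal(scale=np.sqrt(tau2), size=n_clusters)
y = X @ beta_true + u[:, None] + rng.normal(scale=np.sqrt(sigma2), size=(n_clusters, n_per))

def gls_fit(X_list, y_list):
    """GLS estimate of beta given cluster covariance sigma2*I + tau2*J."""
    V = sigma2 * np.eye(n_per) + tau2 * np.ones((n_per, n_per))
    Vinv = np.linalg.inv(V)
    A = sum(Xj.T @ Vinv @ Xj for Xj in X_list)
    b = sum(Xj.T @ Vinv @ yj for Xj, yj in zip(X_list, y_list))
    return np.linalg.solve(A, b)

# Leave-one-cluster-out: each fold is a linear solve, not a new MCMC run.
lco_preds = []
for j in range(n_clusters):
    keep = [k for k in range(n_clusters) if k != j]
    beta_hat = gls_fit([X[k] for k in keep], [y[k] for k in keep])
    lco_preds.append(X[j] @ beta_hat)  # held-out cluster: random effect has mean 0

lco_mse = float(np.mean([(yj - pj) ** 2 for yj, pj in zip(y, lco_preds)]))
```

Because the variance components are treated as known, the per-fold cost is a small linear-algebra problem, which is what makes full leave-one-cluster-out CV tractable without rerunning the sampler.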