Abstract

Modeling correlation (and covariance) matrices can be challenging due to the positive-definiteness constraint and potential high-dimensionality. Our approach is to decompose the covariance matrix into the correlation and variance matrices and propose a novel Bayesian framework based on modeling the correlations as products of unit vectors. By specifying a wide range of distributions on a sphere (e.g., the squared-Dirichlet distribution), the proposed approach induces flexible prior distributions for covariance matrices (that go beyond the commonly used inverse-Wishart prior). For modeling real-life spatio-temporal processes with complex dependence structures, we extend our method to dynamic cases and introduce unit-vector Gaussian process priors in order to capture the evolution of correlation among components of a multivariate time series. To handle the intractability of the resulting posterior, we introduce the adaptive Δ-Spherical Hamiltonian Monte Carlo. We demonstrate the validity and flexibility of our proposed framework in a simulation study of periodic processes and an analysis of local field potential activity recorded from rats performing a complex sequence memory task.
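
The sphere-product construction can be sketched in a few lines of NumPy. The snippet below is a minimal illustration, not the authors' implementation: each row of a lower-triangular factor L is drawn as a unit vector (here via a squared-Dirichlet draw, with hypothetical concentration parameters alpha), so that R = L Lᵀ has unit diagonal and is positive semi-definite by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # dimension of the correlation matrix

# Draw each Cholesky row as a unit vector on a sphere of increasing dimension.
# The squared-Dirichlet construction is one choice of spherical distribution:
# if z ~ Dirichlet(alpha), then sqrt(z) with random signs lies on the unit sphere.
L = np.zeros((D, D))
for i in range(D):
    alpha = np.ones(i + 1)                      # hypothetical concentration parameters
    z = rng.dirichlet(alpha)
    signs = rng.choice([-1.0, 1.0], size=i + 1)
    row = signs * np.sqrt(z)
    row[-1] = abs(row[-1])                      # positive diagonal for identifiability
    L[i, : i + 1] = row

# Every entry of R is an inner product of two unit vectors, so the diagonal
# equals one and R is a valid correlation matrix by construction.
R = L @ L.T
print(np.round(R, 3))
print(np.allclose(np.diag(R), 1.0))             # True
```

Placing different distributions on these unit vectors (squared-Dirichlet, uniform on the sphere, or a Gaussian process pushed onto the sphere in the dynamic case) is what yields the flexible priors described in the abstract.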

Highlights

  • Modeling covariance matrices—or more broadly, positive definite (PD) matrices—is one of the most fundamental problems in statistics

  • A limitation of this class of parametric models is that some processes may not be adequately captured by them. The main contributions of this paper are: (a) a sphere-product representation of the correlation/covariance matrix that induces flexible priors for correlation/covariance matrices and processes; (b) a general and flexible framework for modeling mean, variance, and correlation processes separately; (c) an efficient algorithm for inferring correlation matrices and processes; (d) the first study of posterior contraction when modeling covariance functions with Gaussian process priors

  • Most important is the discovery of 'in sequence' (InSeq) vs. 'out of sequence' (OutSeq) differences before 500 ms, which reveal changes in neural activity associated with the complex cognitive process of identifying whether events occurred in their expected order

Summary

Introduction

Modeling covariance matrices—or more broadly, positive definite (PD) matrices—is one of the most fundamental problems in statistics. While our proposed method in this paper is based on the separation strategy (Barnard et al., 2000) and the Cholesky decomposition, the main distinction from existing methods is that it represents each entry of the correlation matrix as a product of unit vectors. This in turn provides a flexible framework for modeling covariance matrices without sacrificing interpretability. A limitation of this class of parametric models is that some processes may not be adequately captured by them. The main contributions of this paper are: (a) a sphere-product representation of the correlation/covariance matrix that induces flexible priors for correlation/covariance matrices and processes; (b) a general and flexible framework for modeling mean, variance, and correlation processes separately, as sketched below; (c) an efficient algorithm for inferring correlation matrices and processes; (d) the first study of posterior contraction when modeling covariance (correlation) functions with Gaussian process priors. This is crucial in modeling spatio-temporal processes with complex structures.
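
As a concrete illustration of the separation strategy, the sketch below assembles a covariance matrix as Σ = diag(σ) R diag(σ), with R built from a unit-vector Cholesky factor. The names sigma and L, and the toy values, are assumptions for illustration; this is not the paper's code.

```python
import numpy as np

def covariance_from_separation(sigma, L):
    """Separation strategy: Sigma = diag(sigma) @ R @ diag(sigma), where
    R = L @ L.T is a correlation matrix because every row of the
    lower-triangular factor L is a unit vector (hence diag(R) == 1)."""
    R = L @ L.T
    S = np.diag(sigma)
    return S @ R @ S, R

# Toy 3x3 example: rows of L are unit vectors; sigma collects the standard
# deviations, which are modeled separately from the correlations.
L = np.array([[1.0, 0.0, 0.0],
              [0.6, 0.8, 0.0],
              [0.3, 0.4, np.sqrt(1 - 0.3**2 - 0.4**2)]])
sigma = np.array([1.5, 0.7, 2.0])

Sigma, R = covariance_from_separation(sigma, L)
print(np.round(R, 3))                            # unit diagonal
print(np.all(np.linalg.eigvalsh(Sigma) > 0))     # positive definite
```

Because the variances enter only through sigma and the correlations only through the unit vectors in L, the two components can be given separate priors (or, in the dynamic setting, separate processes) without compromising positive definiteness.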

Connection to the Inverse-Wishart Prior
More Flexible Priors
Dynamically Modeling the Covariance Matrices
Posterior Contraction Theorem
Posterior Inference
Metropolis-within-Gibbs
Spherical HMC
Adaptive Spherical HMC
Simulation Studies
Normal-inverse-Wishart Problem
Simulated Periodic Processes
Analysis of Local Field Potential Activity
Conclusion