Abstract

Random matrix theory (RMT) is a sophisticated technique for analyzing the cross-correlations of multivariate time series, but it is limited to characterizing linear relationships. In this paper, we propose a new mutual-information matrix analysis to study the nonlinear interactions of multivariate time series, with the following properties: (i) the N-dimensional mutual information, ranging between 0 and 1, describes the strength of nonlinear interactions; (ii) the eigenvalues of the random mutual-information matrix follow the Marchenko–Pastur distribution, except that the dominant eigenvalue is significantly larger than the others; (iii) the components of most eigenvectors of the random mutual-information matrix follow a Gaussian distribution, while the components of the dominant eigenvector tend to follow a uniform distribution. A large value of the N-dimensional mutual information, as well as deviations from this eigenvalue distribution or from these eigenvector-component distributions, imply the presence of interactions among the underlying time series. In the empirical analysis, we design a simulation that reveals the advantages of the mutual-information analysis over RMT. We also apply the mutual-information matrix analysis to a real-world application, which indicates the presence of interactions among stock time series.
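
The abstract does not specify how the mutual-information matrix is estimated. As an illustration only, the sketch below builds a pairwise, entropy-normalised mutual-information matrix from binned data and inspects its eigenvalue spectrum; the quantile binning, the normalisation by sqrt(H_i H_j), and the `mi_matrix` helper are assumptions for this example, not necessarily the estimator used in the paper.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mi_matrix(X, n_bins=16):
    """Pairwise mutual-information matrix for a T x N data array X.
    Each series is discretised into n_bins quantile bins; each entry is
    normalised to [0, 1] as I(i, j) / sqrt(H(i) * H(j))."""
    T, N = X.shape
    codes = np.empty((T, N), dtype=int)
    for j in range(N):
        # interior quantile edges -> n_bins roughly equal-frequency bins
        edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)[1:-1])
        codes[:, j] = np.digitize(X[:, j], edges)
    # entropy of each binned series: H(i) = I(i, i)
    ent = np.array([mutual_info_score(codes[:, j], codes[:, j]) for j in range(N)])
    M = np.eye(N)
    for i in range(N):
        for j in range(i + 1, N):
            mi = mutual_info_score(codes[:, i], codes[:, j])
            M[i, j] = M[j, i] = mi / np.sqrt(ent[i] * ent[j])
    return M

# Independent Gaussian noise: the bulk eigenvalues should resemble a
# Marchenko-Pastur-like law, with the dominant eigenvalue separated from it.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 100))
eigvals = np.linalg.eigvalsh(mi_matrix(X))   # ascending order
print(eigvals[-3:])
```

For correlated or nonlinearly coupled series, large deviations of the top eigenvalues from the bulk (and of the eigenvector components from Gaussian behaviour) would signal interactions, in line with the criteria stated in the abstract.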
