Abstract

In this paper, an information-based measure of association between time series, called the information-based correlation coefficient (ICC), is introduced to potentially overcome some of the problems affecting Pearson's correlation coefficient and other commonly used coefficients. Beyond its simple definition, the ICC appears to offer several advantages over traditional correlation coefficients: a more natural interpretation in terms of the reciprocal informativeness of two series, higher reliability for independent white noise, Student's t, and AR(1) processes, higher reliability in the presence of spurious correlation and outliers, and a well-developed mathematical theory based on entropy. The ICC can also detect a number of nonlinear relationships, although it may not be equitable or fully general. Moreover, the ICC can also be computed for ordinal data, where it offers higher reliability for independent ordinal-valued time series. Monte Carlo simulations, applications to real data, properties of the estimator, and asymptotics under independence are also discussed. In particular, the paper includes applications to signal processing, chaotic time series, and macroeconomic and weather data.
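The abstract does not reproduce the ICC formula itself, but the general idea of an entropy-based association measure can be illustrated with a minimal sketch: discretize the two series, estimate marginal and joint Shannon entropies from empirical frequencies, and normalize their mutual information to [0, 1]. The function names, the quantile binning, and the choice of normalization below are illustrative assumptions, not the paper's exact ICC definition.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (nats) of a discrete sequence, from empirical frequencies."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def joint_entropy(x_labels, y_labels):
    """Joint Shannon entropy of two aligned discrete sequences."""
    pairs = np.stack([x_labels, y_labels], axis=1)
    _, counts = np.unique(pairs, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def info_coefficient(x, y, n_bins=8):
    """Illustrative entropy-based coefficient in [0, 1]:
    normalized mutual information of quantile-binned series.
    NOTE: a stand-in for the idea, not the paper's ICC."""
    cuts_x = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    cuts_y = np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1])
    xb, yb = np.digitize(x, cuts_x), np.digitize(y, cuts_y)
    hx, hy = entropy(xb), entropy(yb)
    mi = hx + hy - joint_entropy(xb, yb)  # mutual information
    denom = max(hx, hy)
    return mi / denom if denom > 0 else 0.0

# A noisy quadratic relationship: Pearson's r is near 0,
# while the information-based coefficient is clearly positive.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = x**2 + 0.1 * rng.standard_normal(2000)
print(np.corrcoef(x, y)[0, 1])
print(info_coefficient(x, y))
```

This toy example mirrors the abstract's claim that an information-based measure can detect nonlinear dependence that linear correlation misses; the actual ICC and its estimator are defined in the paper.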
