Abstract

The structure of time series and letter sequences is investigated using the concepts of entropy and complexity. First, conditional entropy and mutual information are introduced and several generalizations are discussed. Several measures of complexity are then introduced and discussed. The capability of these concepts to describe the structure of time series and letter sequences generated by nonlinear maps, as well as of data series from meteorology, astrophysics, cardiology, cognitive psychology, and finance, is investigated. Finally, the relation between complexity and the predictability of information strings, and the relation between local order and the predictability of time series, are discussed.
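The conditional entropy and mutual information mentioned above can be estimated directly from a symbol sequence via relative block frequencies. The following sketch is not the paper's own implementation but a minimal plug-in (frequency-count) estimator, with all function names chosen here for illustration:

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy H(X) of a symbol sequence, in bits."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def block_entropy(seq, k):
    """Entropy of overlapping length-k blocks of the sequence."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    return entropy(blocks)

def conditional_entropy(seq, k=1):
    """h_k = H_{k+1} - H_k: mean uncertainty of the next symbol
    given the preceding k symbols."""
    return block_entropy(seq, k + 1) - block_entropy(seq, k)

def mutual_information(seq, lag=1):
    """I(X_n; X_{n+lag}) estimated from symbol pairs at the given lag."""
    pairs = list(zip(seq, seq[lag:]))
    return entropy(seq[:-lag]) + entropy(seq[lag:]) - entropy(pairs)
```

For a strictly periodic sequence such as "ABAB...", the conditional entropy is (up to small finite-sample effects) zero, since the next symbol is fully determined by its predecessor, while the lag-1 mutual information approaches one bit; plug-in estimates of this kind are biased for short sequences, which is one motivation for the generalized measures the abstract refers to.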
