Abstract

Healthy systems exhibit complex dynamics in how the information embedded in physiologic signals changes across multiple time scales, and this complexity can be quantified by multiscale entropy (MSE) analysis. Here, we propose a measure of complexity called entropy of entropy (EoE) analysis. It combines the features of MSE with those of an alternative measure of information, called superinformation, originally developed for DNA sequences. We apply this hybrid analysis to cardiac interbeat interval time series and find that the EoE value is significantly higher for the healthy group than for the pathologic groups. In particular, a short time series of 70 heartbeats is sufficient for EoE analysis with an accuracy of 81%, and a longer series of 500 beats raises the accuracy to 90%. In addition, a plot of EoE versus Shannon entropy for heart rate time series exhibits an inverted-U relationship, with the maximal EoE value appearing between the extremes of order and disorder.
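The detailed EoE procedure is given in the Methods section (not reproduced here). As a rough illustration of the two-level idea described in the abstract, the Python sketch below coarse-grains a series into windows, computes a Shannon entropy inside each window, and then computes the Shannon entropy of those window entropies. The window length `tau`, the number of discretization states `n_states`, and the function names are illustrative assumptions, not the paper's exact parameters or implementation.

```python
import numpy as np

def shannon_entropy(symbols):
    """Shannon entropy (in nats) of a sequence of discrete symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def entropy_of_entropy(x, tau=5, n_states=10):
    """Two-level 'entropy of entropy' sketch for a 1-D series x.

    Level 1: split x into non-overlapping windows of length tau and compute
             the Shannon entropy of each window after binning the amplitudes
             into n_states equal-width states.
    Level 2: compute the Shannon entropy of the resulting sequence of
             window entropies (again discretized into n_states states).
    """
    x = np.asarray(x, dtype=float)

    # Discretize amplitudes into n_states equal-width states over the data range.
    edges = np.linspace(x.min(), x.max(), n_states + 1)
    symbols = np.clip(np.digitize(x, edges[1:-1]), 0, n_states - 1)

    # Level 1: entropy inside each coarse-graining window of length tau.
    n_windows = len(x) // tau
    window_entropies = np.array([
        shannon_entropy(symbols[i * tau:(i + 1) * tau])
        for i in range(n_windows)
    ])

    # Level 2: entropy of the window-entropy values themselves.
    # A window of tau symbols has entropy between 0 and log(tau).
    ent_edges = np.linspace(0.0, np.log(tau), n_states + 1)
    ent_symbols = np.clip(np.digitize(window_entropies, ent_edges[1:-1]),
                          0, n_states - 1)
    return shannon_entropy(ent_symbols)
```

Applied to an interbeat (RR) interval series, a highly regular signal yields low level-1 entropies in every window (low EoE), a highly random signal yields uniformly high level-1 entropies (again low EoE), while a healthy, fluctuating signal yields a broad mix of window entropies and hence a high EoE, consistent with the inverted-U behavior reported in the abstract.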

Highlights

  • Biological systems produce and use information from both their internal and external environments to adapt and survive [1]

  • All three curves, for the healthy, the congestive heart failure (CHF), and the atrial fibrillation (AF) groups, increase monotonically at small τ

  • Multiscale entropy (MSE) has been widely applied in analyzing many physiologic signals, such as heart rate [2,3,24], electroencephalography (EEG) signals [25,26,27], blood oxygen level-dependent signals in functional magnetic resonance imaging [28], diffusion tensor imaging (DTI) of the brain [29], neuronal spiking [30], center of pressure signals in balance [31,32], and intracranial pressure signals [33]

Introduction

Biological systems produce and use information from both their internal and external environments to adapt and survive [1]. The complexity of a biological system, in terms of its output (e.g., physiologic signals), is considered a reflection of its ability to adapt and function in an ever-changing environment. Because complex nonlinear interactions regulate a healthy physiologic signal, the signal is constantly changing and hard to predict [1], and the resulting complex behavior differs from both highly random and highly regular behavior. Shannon entropy, proposed as a measure of information for communication in 1948 [11], quantifies the average information of all possible events weighted by their probabilities; it does not consider the relations between distinct events in a time series.
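As a minimal illustration of this last point (the sequences and function names below are ours, not from the paper), two symbol sequences with the same symbol frequencies have identical Shannon entropy even when their temporal structure is completely different:

```python
import numpy as np
from collections import Counter

def shannon_entropy(seq):
    """H = -sum_i p_i * log(p_i), estimated from symbol frequencies."""
    counts = np.array(list(Counter(seq).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

# Same symbol frequencies (5 A's, 5 B's), very different temporal structure:
# Shannon entropy cannot tell them apart.
regular  = "ABABABABAB"
shuffled = "AABBABBABA"
print(shannon_entropy(regular), shannon_entropy(shuffled))  # both log(2) ≈ 0.693
```

This insensitivity to temporal ordering is one motivation for multiscale and two-level measures such as MSE and EoE.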

