Abstract

To quantify the irregularity of data, there are a number of entropy measures, each with its own advantages and disadvantages. In this pilot study, a new concept, namely ensemble entropy, is introduced and used to generate more stable, lower-bias signal patterns for entropy estimation. We propose ensemble versions of sample entropy (SampEn), permutation entropy, dispersion entropy (DispEn), and fluctuation DispEn (FDispEn) based on combining different parameter initializations of an original entropy method. In addition, ensemble Shannon and conditional entropy methods, based on the entropy values obtained by different entropy algorithms, are developed in this study. We applied the techniques to several synthetic and three biomedical datasets to investigate the behavior of the ensemble methods under changes in the data dynamics. The results suggest that the ensemble approaches are able to distinguish different kinds of noise and the degrees of randomness in our generated MIX processes. Ensemble SampEn, unlike SampEn, does not result in undefined values for short signals. Ensemble DispEn needs a smaller number of samples to distinguish different kinds of noise. Based on Hedges' g effect size, the majority of ensemble methods yield larger differences between younger and older subjects using their RR intervals, as well as between healthy young and elderly subjects using their walking stride interval data. The ensemble algorithms lead to more stable results (lower coefficients of variation) for the synthetic data (different kinds of noise and mixed processes) and discriminate different types of physiological signals better than their corresponding original entropy approaches. The Matlab code used in this paper is available at https://github.com/HamedAzami/.
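The ensemble idea described above can be illustrated with a minimal sketch: compute a classical entropy measure under several parameter initializations and average the defined values. The sketch below uses sample entropy with a grid of embedding dimensions `m` and tolerance factors `r`; the specific grid, the averaging rule, and the NaN-skipping behavior are illustrative assumptions, not the authors' exact algorithm (their Matlab implementation is at the GitHub link above).

```python
import numpy as np

def sample_entropy(x, m, r):
    """Classical sample entropy of 1-D signal x with embedding
    dimension m and tolerance r (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count_matches(dim):
        # Embedding vectors of length `dim`
        templates = np.array([x[i:i + dim] for i in range(n - dim)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (no self-match)
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    if a == 0 or b == 0:
        return np.nan  # classical SampEn is undefined for short signals
    return -np.log(a / b)

def ensemble_sample_entropy(x, ms=(1, 2, 3), r_factors=(0.1, 0.15, 0.2)):
    """Illustrative ensemble SampEn: average over a grid of (m, r)
    initializations, skipping any undefined (NaN) values, so a result
    is returned as long as at least one combination is defined."""
    sd = np.std(x)
    vals = [sample_entropy(x, m, f * sd) for m in ms for f in r_factors]
    vals = [v for v in vals if np.isfinite(v)]
    return float(np.mean(vals)) if vals else float("nan")
```

Averaging over parameter settings is what makes the ensemble estimate more stable than any single choice of (m, r), and skipping undefined combinations is one simple way an ensemble SampEn can avoid the undefined values that plague classical SampEn on short records.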
