Abstract

Entropy quantification algorithms are becoming a prominent tool for the physiological monitoring of individuals through the effective measurement of irregularity in biological signals. However, to ensure their effective adoption in monitoring applications, the performance of these algorithms needs to be robust when analysing time-series containing missing and outlier samples, which are common occurrences in physiological monitoring setups such as wearable devices and intensive care units. This paper focuses on augmenting Dispersion Entropy (DisEn) by introducing novel variations of the algorithm for improved performance in such applications. The original algorithm and its variations are tested under different experimental setups that are replicated across heart rate interval, electroencephalogram, and respiratory impedance time-series. Our results indicate that the algorithmic variations of DisEn achieve considerable improvements in performance, while our analysis confirms, in agreement with previous research, that outlier samples can have a major impact on the performance of entropy quantification algorithms. Consequently, the presented variations can aid the application of DisEn in physiological monitoring through the mitigation of the disruptive effects of missing and outlier samples.

Highlights

  • With the advancement of physiological recording technology deployed across a broad spectrum of applications, from wearable devices to intensive care units, increased amounts of data are becoming available for analysis [1,2]

  • The robustness of Approximate Entropy (ApEn), Sample Entropy (SampEn), and Fuzzy Entropy (FuzzyEn) has been tested when analysing time-series containing missing samples, and the results indicate that, while the classification capacity of the algorithms can be preserved under certain conditions, the fluctuations of entropy values can be large, affecting the accuracy of the results extracted for each analysed signal segment [18]

  • Concerning the effect of outliers, ApEn and SampEn have been tested, and the results indicate that outlier samples can disrupt the process of entropy quantification to a much greater extent than missing samples and should be a key consideration when testing the robustness of respective algorithms [20,21]


Introduction

With the advancement of physiological recording technology deployed across a broad spectrum of applications, from wearable devices to intensive care units, increased amounts of data are becoming available for analysis [1,2]. While the derived information can aid medical decision making, leading to personalised and prompt treatments, the successful implementation of data analysis algorithms is limited by the quality of the recorded data, which often contains missing and outlier samples caused by user movement, loose equipment attachment, and electromagnetic interference [3,4,5]. In the case of wearable devices, low data quality caused by missing and outlier samples can limit the prognostic effectiveness of the algorithms, while in the case of intensive care units it can be life threatening through the phenomenon of “alarm fatigue” [6,7]. Building upon the initial extension of entropy to information theory by Shannon [9], novel variations such as Approximate Entropy (ApEn) [10], Sample Entropy (SampEn) [11], Permutation Entropy (PEn) [12], Fuzzy Entropy (FuzzyEn) [13], and Dispersion Entropy (DisEn) [14] have been implemented as nonlinear indexes aiding disease diagnosis and prognosis.
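To make the entropy measure at the centre of this paper concrete, the standard DisEn computation (normal-CDF mapping, class assignment, dispersion-pattern counting, and Shannon entropy of the pattern probabilities) can be sketched in plain Python. This is a minimal illustration of the original algorithm as published by Rostaghi and Azami, not of the augmented variations this paper introduces; the function name and the default parameters (embedding dimension m = 2, c = 3 classes, delay d = 1) are our own choices for the sketch.

```python
import math
from collections import Counter


def dispersion_entropy(x, m=2, c=3, d=1):
    """Normalised Dispersion Entropy of sequence x.

    m: embedding dimension, c: number of classes, d: time delay.
    Assumes x has non-zero variance and len(x) > (m - 1) * d.
    """
    n = len(x)
    mu = sum(x) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in x) / n)
    # Map each sample to (0, 1) with the normal CDF, using the
    # series' own mean and standard deviation.
    y = [0.5 * (1 + math.erf((v - mu) / (sigma * math.sqrt(2)))) for v in x]
    # Assign each mapped value to one of c integer classes (1..c).
    z = [min(c, max(1, int(round(c * v + 0.5)))) for v in y]
    # Count dispersion patterns: embedding vectors of length m with delay d.
    patterns = Counter(
        tuple(z[i + k * d] for k in range(m)) for i in range(n - (m - 1) * d)
    )
    total = sum(patterns.values())
    # Shannon entropy of the pattern probabilities, normalised by ln(c^m)
    # so the result lies in (0, 1].
    ent = -sum((p / total) * math.log(p / total) for p in patterns.values())
    return ent / math.log(c ** m)
```

On this sketch, an irregular signal (e.g. white noise) yields values near 1, while a smooth periodic signal concentrates its dispersion patterns and scores lower, which is the irregularity contrast the abstract refers to.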
