Abstract

Discrete-time signals carry information about systems and the internal functional mechanisms that characterize their complexity. Complexity measures are strongly related to information content, and in recent years they have been evaluated on various signals in many ways. This paper uses information-theoretic estimates of complexity, in the form of different types of entropies, to estimate the complexity of various synthesized discrete-time signals. The results show that such indices can be a useful tool in diagnostics, fault detection, and further development.
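
For context, the baseline among these measures is the Shannon entropy; the following is the standard textbook definition over an estimated probability distribution, not a formula quoted from this abstract:

    H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i

where p_i denotes the estimated probability of the i-th signal value or amplitude bin.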

Highlights

  • This paper introduces different entropies as information theory-based parameters in order to evaluate the complexity of discrete signals

  • First, the Shannon entropies were computed for the test signals; in this case there are no significant differences, as can be observed in Figure 5 (a minimal computation sketch follows this list)

  • This paper focuses on information theory-based complexity indices which are different types of entropies
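
Since the highlights mention computing Shannon entropy without detailing the procedure, here is a minimal Python sketch of a histogram-based Shannon entropy estimate for a sampled signal. The shannon_entropy helper, its 64-bin default, and the base-2 logarithm are illustrative assumptions, not the paper's implementation:

    import numpy as np

    def shannon_entropy(signal, bins=64):
        # Estimate the probability mass function from a histogram of the samples.
        # The bin count is an illustrative choice, not taken from the paper.
        counts, _ = np.histogram(signal, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                    # drop empty bins (0 * log 0 := 0)
        return -np.sum(p * np.log2(p))  # entropy in bits

    # Example: a pure sine concentrates its samples in few amplitude bins,
    # while added white noise spreads the distribution across more bins.
    t = np.linspace(0, 1, 1000)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.5 * np.random.randn(t.size)
    print(shannon_entropy(clean), shannon_entropy(noisy))

On such test signals, the noisy version yields the higher estimate because the noise spreads the amplitude distribution, which is the kind of difference the entropy indices are meant to capture.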


Introduction

Quantifying the complexity of a discrete signal is an important task because it is usually strongly related to the information content of the signal. This paper introduces different entropies as information theory-based parameters for evaluating the complexity of discrete signals. Both clean and noisy signals are analyzed; the evaluations are performed on synthesized signals. This paper is organized as follows: the second chapter reviews the most widely used entropy metrics and presents their characteristics through the corresponding implementing algorithms. Chapter three presents the test signals and the proposed procedures, and chapter four brings the experimental results. Concluding remarks and possible future work ideas are presented in the final chapter.
