Abstract

In this paper, we estimate the Shannon entropy S(f) = −E[log f(X)] of a one-sided linear process with probability density function f(x). We employ the integral estimator S_n(f), which plugs the standard kernel density estimator f_n(x) of f(x) into the entropy functional. We show that S_n(f) converges to S(f) almost surely and in L^2 under reasonable conditions.
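
For concreteness, the plug-in construction behind S_n(f) has the following standard form (a sketch using the usual ingredients of a kernel density estimator; the kernel K, bandwidth sequence h_n, and truncation sequence k_n below are generic and are not fixed by this abstract):

  f_n(x) = \frac{1}{n h_n} \sum_{i=1}^{n} K\left( \frac{x - X_i}{h_n} \right),
  \qquad
  S_n(f) = -\int_{-k_n}^{k_n} f_n(x) \log f_n(x) \, dx,

where [−k_n, k_n] is the symmetric truncation interval described in the Introduction.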

Highlights

  • Let f(x) be the common probability density function of a sequence {X_n}_{n=1}^∞ of identically distributed observations

  • Entropy is widely used in information theory, statistical classification, pattern recognition, and related fields, since it measures the amount of uncertainty present in a probability distribution

  • Plugging kernel density estimators into the arguments of H and integrating only over the symmetric interval [−k_n, k_n], which is determined by a sequence {k_n}_{n=1}^∞ of a certain order, they obtained a result on the estimation of Shannon entropy using the estimator that Beirlant et al. (1997) refer to as the integral estimator

Summary

Introduction

Let f(x) be the common probability density function of a sequence {X_n}_{n=1}^∞ of identically distributed observations. Plugging kernel density estimators (see their paper and the references therein) into the arguments of H and integrating only over the symmetric interval [−k_n, k_n], which is determined by a sequence {k_n}_{n=1}^∞ of a certain order, they obtained a result on the estimation of Shannon entropy using the estimator that Beirlant et al. (1997) refer to as the integral estimator; a numerical sketch of this estimator is given below. Utilizing the Fourier transform along with the projection method, they show that the kernel entropy estimator satisfies a central limit theorem for short-memory linear processes.
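
The following minimal Python sketch illustrates the integral (plug-in) estimator under illustrative assumptions: a Gaussian kernel, a fixed bandwidth h, and a fixed truncation point k_n, none of which are the specific rate conditions required by the results discussed here; the sample is drawn from a hypothetical short-memory one-sided linear process.

import numpy as np

def kde(x, sample, h):
    # Gaussian-kernel density estimate f_n(x) evaluated at the grid points x
    # (illustrative choice of kernel; the bandwidth h is assumed fixed here)
    u = (x[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(sample) * h * np.sqrt(2.0 * np.pi))

def integral_entropy_estimator(sample, h, k_n, grid_size=2000):
    # Plug-in estimate S_n(f) = -integral over [-k_n, k_n] of f_n(x) log f_n(x) dx,
    # approximated by a Riemann sum on an equally spaced grid
    x = np.linspace(-k_n, k_n, grid_size)
    f_n = kde(x, sample, h)
    integrand = np.where(f_n > 0.0, f_n * np.log(f_n), 0.0)  # convention 0 * log 0 = 0
    dx = x[1] - x[0]
    return -integrand.sum() * dx

# Hypothetical data: a one-sided linear process X_t = sum_j a_j eps_{t-j}
# with summable (short-memory) coefficients a_j = 0.5**j and Gaussian innovations
rng = np.random.default_rng(0)
a = 0.5 ** np.arange(50)
eps = rng.standard_normal(5000 + len(a) - 1)
X = np.convolve(eps, a, mode="valid")          # length-5000 sample from the process
S_hat = integral_entropy_estimator(X, h=0.3, k_n=5.0)
print(S_hat)  # should be close to the Gaussian entropy 0.5 * log(2 * pi * e * Var(X))

A Riemann sum is used instead of a library integrator only to keep the sketch dependency-free; any quadrature rule over [−k_n, k_n] would serve equally well.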
