Abstract

Because of its flexibility and multiple interpretations, the concept of information entropy, in both its continuous and discrete forms, has proven highly relevant in numerous scientific fields. For example, it is used as a measure of disorder in thermodynamics, as a measure of uncertainty in statistical mechanics and in classical and quantum information science, as a measure of diversity in ecological structures, and as a criterion for classifying races and species in population dynamics. Orthogonal polynomials are a useful tool for solving and interpreting differential equations, and they have recently been studied intensively in many areas. In statistics, for example, fitting a model to data with orthogonal polynomials eliminates collinearity while capturing the same information as ordinary polynomials. In this paper, we consider the Tsallis, Kaniadakis and Varma entropies of Chebyshev polynomials of the first kind and obtain asymptotic expansions. In the particular case of quadratic entropies, we provide concrete computations.
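
For reference, here is a minimal sketch of the three entropy functionals under one common convention, written for a generic probability density $\rho$; the specific density attached to the Chebyshev polynomials and the normalization used in the paper are assumptions not fixed by the abstract:

\[
T_q[\rho] = \frac{1}{q-1}\left(1 - \int \rho^{\,q}(x)\,dx\right), \qquad q > 0,\; q \neq 1,
\]
\[
K_\kappa[\rho] = -\int \frac{\rho^{\,1+\kappa}(x) - \rho^{\,1-\kappa}(x)}{2\kappa}\,dx, \qquad 0 < |\kappa| < 1,
\]
\[
V_{a,b}[\rho] = \frac{1}{b-a}\,\ln \int \rho^{\,a+b-1}(x)\,dx, \qquad b-1 < a < b,\; b \ge 1.
\]

In this convention, the quadratic Tsallis case corresponds to $q = 2$, for which $T_2[\rho] = 1 - \int \rho^{2}(x)\,dx$, and the Varma entropy reduces to the Rényi entropy of order $a$ when $b = 1$.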
