Abstract

In the framework of superstatistics it has been shown that one can calculate the entropy of nonextensive statistical mechanics. We follow a similar procedure: we assume a Γ(χ²) distribution for the intensive parameter β that also depends on a parameter p_l. From it we calculate the Boltzmann factor and show that it is possible to obtain the information entropy S = k ∑_{l=1}^{Ω} s(p_l), where s(p_l) = 1 − p_l^{p_l}. By maximizing this information measure, p_l is obtained as a function of βE_l, and at this stage of the procedure p_l can be identified with the probability distribution. We show the validity of the saddle-point approximation, and we briefly discuss the generalization of one of the four Khinchin axioms; the modified axioms are then in accordance with the proposed entropy. As further possibilities, we also propose other entropies depending on p_l that resemble the Kaniadakis entropy and two possible Sharma–Mittal entropies. Expanding all the entropies in this work in series, the first term is the Shannon entropy.
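
For concreteness, the chain of quantities named above can be sketched in standard Beck–Cohen superstatistics notation. The specific Γ(χ²) normalization below and the parameter n are the conventional choices, stated here as assumptions; in the paper the distribution additionally depends on p_l, which is not reproduced here:

f(β) = \frac{1}{\Gamma(n/2)} \left(\frac{n}{2\beta_0}\right)^{n/2} \beta^{n/2-1}\, e^{-n\beta/(2\beta_0)},

B(E_l) = \int_0^{\infty} f(\beta)\, e^{-\beta E_l}\, d\beta = \left(1 + \frac{2\beta_0 E_l}{n}\right)^{-n/2},

S = k \sum_{l=1}^{\Omega} \left(1 - p_l^{\,p_l}\right),

1 - p_l^{\,p_l} = 1 - e^{p_l \ln p_l} = -p_l \ln p_l - \frac{(p_l \ln p_l)^2}{2!} - \cdots

The last expansion makes the closing claim explicit: keeping only its first term and summing over l gives −k ∑_l p_l ln p_l, the Shannon entropy.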
