Abstract
In the framework of superstatistics it has been shown that one can calculate the entropy of nonextensive statistical mechanics. We follow a similar procedure: we assume a Γ(χ²) distribution depending on β that also depends on a parameter p_l. From it we calculate the Boltzmann factor and show that it is possible to obtain the information entropy S = k ∑_{l=1}^{Ω} s(p_l), where s(p_l) = 1 − p_l^{p_l}. By maximizing this information measure, p_l is obtained as a function of βE_l, and at this stage of the procedure p_l can be identified with the probability distribution. We show the validity of the saddle-point approximation, and we briefly discuss the generalization of one of the four Khinchin axioms; the modified axioms are then in accordance with the proposed entropy. As further possibilities, we also propose other entropies depending on p_l that resemble the Kaniadakis entropy and two possible Sharma-Mittal entropies. By expanding all the entropies in this work in series, the first term obtained is the Shannon entropy.
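The series claim at the end of the abstract can be checked numerically. Expanding p^p = exp(p ln p) gives 1 − p^p = −p ln p − (p ln p)²/2 − …, so the leading term of S = k ∑ (1 − p_l^{p_l}) is the Shannon entropy −k ∑ p_l ln p_l. The following sketch is illustrative only (it is not from the paper); the function names and the near-uniform test distribution are our own choices, with k set to 1:

```python
import math

def proposed_entropy(probs, k=1.0):
    """Entropy of the abstract: S = k * sum_l (1 - p_l**p_l)."""
    return k * sum(1.0 - p**p for p in probs)

def shannon_entropy(probs, k=1.0):
    """Shannon entropy: -k * sum_l p_l * ln(p_l)."""
    return -k * sum(p * math.log(p) for p in probs)

# Since 1 - p**p = -p*ln(p) - (p*ln(p))**2 / 2 - ..., the two
# entropies should agree to leading order when each p_l is small,
# e.g. for a near-uniform distribution over many states.
uniform = [1.0 / 1000] * 1000  # illustrative choice of distribution
S = proposed_entropy(uniform)
Sh = shannon_entropy(uniform)
print(S, Sh)  # close, with S slightly below Sh (the correction terms are negative)
```

Because every higher-order term −(p ln p)^n/n! carries the same sign for 0 < p < 1, the proposed entropy lies slightly below the Shannon value, and the gap shrinks as the p_l become small.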