Rényi Entropy in Statistical Mechanics
| Main Authors: | |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2022 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9407421/ https://www.ncbi.nlm.nih.gov/pubmed/36010744 http://dx.doi.org/10.3390/e24081080 |
Summary: Rényi entropy was originally introduced in information theory as a parametric relaxation of Shannon (in physics, Boltzmann–Gibbs) entropy. It has also fuelled various attempts to generalise statistical mechanics, although most of these introduce the entropy artificially rather than from physical arguments. However, as we will show, no modification of statistical mechanics is needed: Rényi entropy arises automatically as the average rate of change of free energy over an ensemble at different temperatures. Moreover, this notion extends to distributions for isospectral, non-isothermal processes, yielding relative versions of free energy in which the Kullback–Leibler divergence or the relative Rényi entropy appears within the corrections to free energy. These generalisations of free energy recover the ordinary thermodynamic potential whenever isothermal processes are considered.
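
The free-energy connection described in the summary matches a known identity (cf. Baez, "Rényi Entropy and Free Energy", 2011), sketched here for orientation rather than quoted from the paper: for a Gibbs state at temperature T₀, the order-q Rényi entropy is minus the slope of the free energy between T₀ and T₀/q. Taking k_B = 1 and the usual notation for the Hamiltonian H, partition function Z, and free energy F:

$$ \rho = \frac{e^{-H/T_0}}{Z(T_0)}, \qquad F(T) = -T \ln Z(T), \qquad S_q(\rho) = \frac{1}{1-q}\,\ln \operatorname{Tr}\rho^{\,q} $$

$$ S_q(\rho) = -\,\frac{F(T_0/q) - F(T_0)}{T_0/q - T_0} \;\xrightarrow{\; q \to 1 \;}\; S = -\,\frac{\partial F}{\partial T}, $$

so the ordinary thermodynamic relation, and with it the usual free-energy potential, is recovered in the isothermal limit q → 1, consistent with the summary's closing remark.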
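
A minimal numerical check of the same identity for a finite spectrum; the energy levels, temperature, and order q below are arbitrary illustrative values, not taken from the article:

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (natural log) of a probability vector p."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

def free_energy(E, T):
    """Helmholtz free energy F(T) = -T ln Z(T) for energy levels E (k_B = 1)."""
    return -T * np.log(np.sum(np.exp(-E / T)))

E = np.array([0.0, 1.0, 2.5, 4.0])  # illustrative energy spectrum
T0, q = 1.5, 2.0                    # reference temperature and Rényi order

p = np.exp(-E / T0)
p /= p.sum()                        # Gibbs distribution at temperature T0

lhs = renyi_entropy(p, q)
rhs = -(free_energy(E, T0 / q) - free_energy(E, T0)) / (T0 / q - T0)
assert np.isclose(lhs, rhs)         # the two sides agree to numerical precision
```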