An Informational Theoretical Approach to the Entropy of Liquids and Solutions
Main Author: |
---|---
Format: | Online Article Text
Language: | English
Published: | MDPI, 2018
Subjects: |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7513034/ https://www.ncbi.nlm.nih.gov/pubmed/33265604 http://dx.doi.org/10.3390/e20070514
Summary: | It is well known that the statistical mechanical theory of liquids has been lagging far behind the theory of either gases or solids; see, for example, Ben-Naim (2006), Fisher (1964), Guggenheim (1952), Hansen and McDonald (1976), Hill (1956), Temperley, Rowlinson and Rushbrooke (1968), and O'Connell (1971). Information theory was recently used to derive and interpret the entropy of an ideal gas of simple particles (i.e., non-interacting and structure-less particles). Starting with Shannon's measure of information (SMI), one can derive the entropy function of an ideal gas, the same function as derived by Sackur (1911) and Tetrode (1912). The new derivation of the same entropy function, based on SMI, has several advantages, as listed in Ben-Naim (2008, 2017). Here we mention two: First, it provides a simple interpretation of the various terms in this entropy function. Second, and more important for our purpose, this derivation may be extended to any system of interacting particles, including liquids and solutions. The main idea is that once one adds intermolecular interactions between the particles, one also adds correlations between the particles. These correlations may be cast in terms of mutual information (MI). Hence, we can start with the informational theoretical interpretation of the entropy of an ideal gas. Then, we add a correction due to correlations, in the form of MI between the locations of the particles (a schematic form of this correction is sketched after this record). This process preserves the interpretation of the entropy of liquids and solutions as a measure of information (or as an average uncertainty about the locations of the particles). It is well known that the entropy of a liquid, any liquid for that matter, is lower than the entropy of a gas. Traditionally, this fact is interpreted in terms of order-disorder: the lower entropy of the liquid is attributed to a higher degree of order compared with that of the gas. However, unlike the transition from a solid to either a liquid or a gaseous phase, where the order-disorder interpretation works well, the same interpretation does not work for the liquid-gas transition. It is hard, if not impossible, to argue that the liquid phase is more "ordered" than the gaseous phase. In this article, we interpret the lower entropy of liquids in terms of SMI. One outstanding liquid, known to be a structured liquid, is water, according to Ben-Naim (2009, 2011). In addition, heavy water, as well as aqueous solutions of simple solutes such as argon or methane, will be discussed in this article.
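
The summary refers to two quantitative ingredients: the Sackur-Tetrode entropy of an ideal gas, which the SMI derivation reproduces, and an MI correction that lowers the entropy once intermolecular correlations are present. A minimal sketch of both is given below; the notation (thermal de Broglie wavelength Λ, particle locations R_1, …, R_N) is assumed here for illustration and is not taken from the paper itself.

```latex
% Sackur--Tetrode entropy of an ideal gas of N structure-less particles,
% with thermal de Broglie wavelength \Lambda = h / \sqrt{2\pi m k_B T}
S_{\mathrm{ideal}} = N k_B \left[ \ln\!\left( \frac{V}{N \Lambda^{3}} \right) + \frac{5}{2} \right]

% Schematic MI correction (assumed form, consistent with the summary):
% intermolecular interactions induce correlations among the locations
% \mathbf{R}_1, \dots, \mathbf{R}_N, whose non-negative total mutual
% information lowers the entropy relative to the ideal-gas value
S = S_{\mathrm{ideal}} - k_B \, I(\mathbf{R}_1; \mathbf{R}_2; \dots; \mathbf{R}_N)
```

Since the mutual information term is non-negative, the corrected entropy cannot exceed the ideal-gas value, which is the informational reading of the lower entropy of liquids described in the summary.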