===Relationship to thermodynamic entropy===
{{Main|Entropy in thermodynamics and information theory}}
The inspiration for adopting the word ''entropy'' in information theory came from the close resemblance between Shannon's formula and formulae already known from [[statistical mechanics]]. In [[statistical thermodynamics]] the most general formula for the thermodynamic [[entropy]] {{math|''S''}} of a [[thermodynamic system]] is the [[Gibbs entropy]] <math display="block">S = - k_\text{B} \sum_i p_i \ln p_i \,,</math> where {{math|''k''<sub>B</sub>}} is the [[Boltzmann constant]] and {{math|''p''<sub>''i''</sub>}} is the probability of a [[Microstate (statistical mechanics)|microstate]]. The [[Entropy (statistical thermodynamics)|Gibbs entropy]] was defined by [[J. Willard Gibbs]] in 1878 after earlier work by [[Ludwig Boltzmann]] (1872).<ref>Compare: Boltzmann, Ludwig (1896, 1898). Vorlesungen über Gastheorie: 2 Volumes – Leipzig 1895/98 UB: O 5262-6. English version: Lectures on gas theory. Translated by Stephen G. Brush (1964) Berkeley: University of California Press; (1995) New York: Dover {{isbn|0-486-68455-5}}</ref>

The Gibbs entropy translates over almost unchanged into the world of [[quantum physics]] to give the [[von Neumann entropy]] introduced by [[John von Neumann]] in 1927: <math display="block">S = - k_\text{B} \operatorname{Tr}(\rho \ln \rho) \,,</math> where {{math|''ρ''}} is the [[density matrix]] of the quantum mechanical system and Tr is the [[Trace (linear algebra)|trace]].<ref>{{Cite book|last=Życzkowski|first=Karol|title=Geometry of Quantum States: An Introduction to Quantum Entanglement |title-link=Geometry of Quantum States |publisher=Cambridge University Press|year=2006|pages=301}}</ref>

At an everyday practical level, the links between information entropy and thermodynamic entropy are not evident. Physicists and chemists are apt to be more interested in ''changes'' in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the [[second law of thermodynamics]], than in an unchanging probability distribution. As the minuteness of the Boltzmann constant {{math|''k''<sub>B</sub>}} indicates, the changes in {{math|''S'' / ''k''<sub>B</sub>}} for even tiny amounts of substances in chemical and physical processes represent amounts of entropy that are extremely large compared to anything in [[data compression]] or [[signal processing]]. In classical thermodynamics, entropy is defined in terms of macroscopic measurements and makes no reference to any probability distribution, which is central to the definition of information entropy.
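As a rough order-of-magnitude illustration of this difference in scale, reversibly transferring one joule of heat at room temperature (about {{math|298 K}}) changes the thermodynamic entropy by {{math|Δ''S'' ≈ 3.4 × 10<sup>−3</sup> J/K}}, so that <math display="block">\frac{\Delta S}{k_\text{B}} \approx \frac{3.4\times 10^{-3}\ \text{J/K}}{1.38\times 10^{-23}\ \text{J/K}} \approx 2.4\times 10^{20}\ \text{nats} \approx 3.5\times 10^{20}\ \text{bits},</math> far more information than is handled in typical data-compression or signal-processing applications.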
The connection between thermodynamics and what is now known as information theory was first made by Boltzmann and expressed by his [[Boltzmann's entropy formula|equation]]: <math display="block">S=k_\text{B} \ln W,</math> where <math>S</math> is the thermodynamic entropy of a particular macrostate (defined by thermodynamic parameters such as temperature, volume, energy, etc.), {{math|''W''}} is the number of microstates (various combinations of particles in various energy states) that can yield the given macrostate, and {{math|''k''<sub>B</sub>}} is the Boltzmann constant.<ref>{{Cite journal|last1=Sharp|first1=Kim|last2=Matschinsky|first2=Franz|date=2015|title=Translation of Ludwig Boltzmann's Paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium"|journal=Entropy|volume=17|pages=1971–2009|doi=10.3390/e17041971|doi-access=free}}</ref> It is assumed that each microstate is equally likely, so that the probability of a given microstate is {{math|1=''p''<sub>''i''</sub> = 1/''W''}}. When these probabilities are substituted into the above expression for the Gibbs entropy (or equivalently {{math|''k''<sub>B</sub>}} times the Shannon entropy), Boltzmann's equation results, as shown explicitly below. In information theoretic terms, the information entropy of a system is the amount of "missing" information needed to determine a microstate, given the macrostate.

In the view of [[Edwin Thompson Jaynes|Jaynes]] (1957),<ref>{{Cite journal|last=Jaynes|first=E. T.|date=1957-05-15|title=Information Theory and Statistical Mechanics|url=https://link.aps.org/doi/10.1103/PhysRev.106.620|journal=Physical Review|volume=106|issue=4|pages=620–630|doi=10.1103/PhysRev.106.620|bibcode=1957PhRv..106..620J|s2cid=17870175 }}</ref> thermodynamic entropy, as explained by [[statistical mechanics]], should be seen as an ''application'' of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, which remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant. Adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states of the system that are consistent with the measurable values of its macroscopic variables, making any complete state description longer. (See article: ''[[maximum entropy thermodynamics]]'').
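To make the substitution above explicit: with {{math|1=''p''<sub>''i''</sub> = 1/''W''}} for each of the {{math|''W''}} equally likely microstates, the Gibbs entropy becomes <math display="block">S = - k_\text{B} \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = - k_\text{B} \ln \frac{1}{W} = k_\text{B} \ln W \,,</math> recovering Boltzmann's equation.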
[[Maxwell's demon]] can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as [[Rolf Landauer|Landauer]] (beginning in 1961) and co-workers<ref>{{Cite journal|last=Landauer|first=R.|date=July 1961|title=Irreversibility and Heat Generation in the Computing Process|url=https://ieeexplore.ieee.org/document/5392446|journal=IBM Journal of Research and Development|volume=5|issue=3|pages=183–191|doi=10.1147/rd.53.0183|issn=0018-8646|access-date=15 December 2021|archive-date=15 December 2021|archive-url=https://web.archive.org/web/20211215235046/https://ieeexplore.ieee.org/document/5392446|url-status=live}}</ref> have shown, to function the demon himself must increase thermodynamic entropy in the process by at least the amount of Shannon information he proposes to first acquire and store; so the total thermodynamic entropy does not decrease (which resolves the paradox). [[Landauer's principle]] imposes a lower bound on the amount of heat a computer must generate to process a given amount of information, though modern computers are far less efficient.
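Quantitatively, Landauer's bound can be stated as follows: erasing one bit of information in an environment at temperature {{math|''T''}} must dissipate at least <math display="block">E \ge k_\text{B} T \ln 2</math> of heat, which at room temperature ({{math|''T'' ≈ 298 K}}) is only about {{math|3 × 10<sup>−21</sup>}} joules per bit, many orders of magnitude below the energy dissipated per bit operation by present-day hardware.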