== Approaches to understanding entropy ==
As a fundamental aspect of thermodynamics and physics, several approaches to entropy beyond those of Clausius and Boltzmann are valid.

=== Standard textbook definitions ===
The following is a list of additional definitions of entropy from a collection of textbooks:
* a measure of [[energy dispersal]] at a specific temperature.
* a measure of disorder in the universe or of the availability of the energy in a system to do work.<ref>{{cite book|last1=Gribbin|first1=John|editor1-last=Gribbin|editor1-first=Mary|title=Q is for quantum : an encyclopedia of particle physics|date=1999|publisher=Free Press|location=New York|isbn=978-0-684-85578-3|url=https://archive.org/details/qisforquantumenc00grib}}</ref>
* a measure of a system's [[thermal energy]] per unit temperature that is unavailable for doing useful [[work (thermodynamics)|work]].<ref>{{cite web|title=Entropy: Definition and Equation|url=https://www.britannica.com/EBchecked/topic/189035/entropy|website=Encyclopædia Britannica|access-date=22 May 2016}}</ref>

In Boltzmann's analysis in terms of constituent particles, entropy is a measure of the number of possible microscopic states (or microstates) of a system in thermodynamic equilibrium.

=== Order and disorder ===
{{Main|Entropy (order and disorder)}}
Entropy is often loosely associated with the amount of [[wikt:order|order]] or [[Randomness|disorder]], or of [[Chaos theory|chaos]], in a [[thermodynamic system]]. The traditional qualitative description of entropy is that it refers to changes in the state of the system and is a measure of "molecular disorder" and the amount of wasted energy in a dynamical energy transformation from one state or form to another. In this direction, several recent authors have derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.<ref name="Brooks">{{cite book|last1=Brooks|first1=Daniel R.|last2=Wiley|first2=E. O.|title=Evolution as entropy : toward a unified theory of biology|date=1988|publisher=University of Chicago Press|location=Chicago [etc.]|isbn=978-0-226-07574-7|edition=2nd}}</ref><ref name="Landsberg-A">{{cite journal | last1 = Landsberg | first1 = P.T. | year = 1984 | title = Is Equilibrium always an Entropy Maximum? | journal = J. Stat. Physics | volume = 35 | issue = 1–2| pages = 159–169 | doi=10.1007/bf01017372|bibcode = 1984JSP....35..159L | s2cid = 122424225 }}</ref><ref name="Landsberg-B">{{cite journal | last1 = Landsberg | first1 = P.T. | year = 1984 | title = Can Entropy and "Order" Increase Together? | journal = Physics Letters | volume = 102A | issue = 4| pages = 171–173 | doi=10.1016/0375-9601(84)90934-4|bibcode = 1984PhLA..102..171L }}</ref> One of the simpler entropy order/disorder formulas is that derived in 1984 by thermodynamic physicist Peter Landsberg, based on a combination of [[thermodynamics]] and [[information theory]] arguments.
He argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measures of the total amount of "disorder" and of "order" in the system are given by:<ref name="Brooks" />{{Reference page|page=69}}<ref name="Landsberg-A" /><ref name="Landsberg-B" />
<math display="block">\mathsf{Disorder} = \frac{C_\mathsf{D}}{C_\mathsf{I}}</math>
<math display="block">\mathsf{Order} = 1 - \frac{C_\mathsf{O}}{C_\mathsf{I}}</math>
Here, <math display="inline">C_\mathsf{D}</math> is the "disorder" capacity of the system, which is the entropy of the parts contained in the permitted ensemble, <math display="inline">C_\mathsf{I}</math> is the "information" capacity of the system, an expression similar to Shannon's [[channel capacity]], and <math display="inline">C_\mathsf{O}</math> is the "order" capacity of the system.<ref name="Brooks" />

=== Energy dispersal ===
{{Main|Entropy (energy dispersal)}}
[[File:Ultra slow-motion video of glass tea cup smashed on concrete floor.webm|thumb|thumbtime=0:04|Slow motion video of a glass cup smashing on a concrete floor. In the very short time period of the breaking process, the entropy of the mass making up the glass cup rises sharply, as the matter and energy of the glass disperse.]]
The concept of entropy can be described qualitatively as a measure of energy dispersal at a specific temperature.<ref>{{cite web|last1=Lambert |first1=Frank L. |title=A Student's Approach to the Second Law and Entropy |url=http://franklambert.net/entropysite.com/students_approach.html }}</ref> Similar terms have been in use from early in the history of [[classical thermodynamics]], and with the development of [[statistical thermodynamics]] and [[quantum mechanics|quantum theory]], entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantised energy levels.

Ambiguities in the terms ''disorder'' and ''chaos'', which usually have meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students.<ref>{{cite journal|last1=Watson|first1=J.R.|last2=Carson|first2=E.M.|title=Undergraduate students' understandings of entropy and Gibbs free energy.|journal=University Chemistry Education|date=May 2002|volume=6|issue=1|page=4|url=http://www.rsc.org/images/Vol_6_No1_tcm18-7042.pdf|issn=1369-5614}}</ref> As the [[second law of thermodynamics]] shows, in an [[isolated system]] internal portions at different temperatures tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids ambiguous terms and describes such spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the [[first law of thermodynamics]]<ref>{{cite journal|last1=Lambert|first1=Frank L.|s2cid=97102995|title=Disorder – A Cracked Crutch for Supporting Entropy Discussions|url=http://franklambert.net/entropysite.com/cracked_crutch.html|journal=Journal of Chemical Education|date=February 2002|volume=79|issue=2|pages=187|doi=10.1021/ed079p187|bibcode=2002JChEd..79..187L}}</ref> (compare discussion in next section).
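As a minimal numerical illustration of this description (the heat capacities and temperatures below are assumed, purely illustrative values, not drawn from the sources above), the following sketch shows two identical bodies in an isolated system equalising their temperatures: the total energy is unchanged, but the entropy increases as the energy disperses.
<syntaxhighlight lang="python">
import numpy as np

# Minimal sketch: two identical bodies with constant heat capacity C are
# brought into thermal contact inside an isolated system (values are illustrative).
C = 100.0                        # heat capacity of each body, J/K
T_hot, T_cold = 400.0, 300.0     # initial temperatures, K

# First law: total energy is conserved, so the common final temperature is the mean.
T_final = (T_hot + T_cold) / 2

# Entropy change of each body: integral of C dT / T from its initial temperature.
dS_hot = C * np.log(T_final / T_hot)     # negative: the hot body loses entropy
dS_cold = C * np.log(T_final / T_cold)   # positive: the cold body gains more

# Second law: the total entropy rises (about +2.1 J/K here), even though the total
# energy is unchanged; the temperature differential needed to extract work is lost.
print(dS_hot + dS_cold)
</syntaxhighlight>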
Physical chemist [[Peter Atkins]], in his textbook ''Physical Chemistry'', introduces entropy with the statement that "spontaneous changes are always accompanied by a dispersal of energy or matter and often both".<ref name="AtkinsPaula2019">{{cite book |author1=Peter Atkins |author2=Julio de Paula |author3=James Keeler |title=Atkins' Physical Chemistry 11e: Volume 3: Molecular Thermodynamics and Kinetics |url=https://books.google.com/books?id=0UKjDwAAQBAJ&pg=PA89 |year=2019 |publisher=Oxford University Press |isbn=978-0-19-882336-0 |page=89}}</ref>

=== Relating entropy to energy ''usefulness'' ===
It is possible (in a thermal context) to regard lower entropy as a measure of the ''effectiveness'' or ''usefulness'' of a particular quantity of energy.<ref>{{Cite journal|title=Book Review of 'A Science Miscellany'|journal=Khaleej Times|publisher=UAE: Galadari Press|date=23 February 1993|page=xi|author=Sandra Saary |url=http://dlmcn.com/entropy2.html}}</ref> Energy supplied at a higher temperature (i.e. with low entropy) tends to be more useful than the same amount of energy available at a lower temperature. Mixing a hot parcel of a fluid with a cold one produces a parcel of intermediate temperature, in which the overall increase in entropy represents a "loss" that can never be replaced. As the entropy of the universe is steadily increasing, its total energy is becoming less useful. Eventually, this is theorised to lead to the [[heat death of the universe]].<ref>{{Cite book |title =Energy and Empire: A Biographical Study of Lord Kelvin | last1= Smith |first1=Crosbie |last2=Wise |first2=M. Norton |publisher=Cambridge University Press |year=1989 |isbn=978-0-521-26173-9 |pages= 500–501 |author-link2 = M. Norton Wise}}</ref>

=== Entropy and adiabatic accessibility ===
A definition of entropy based entirely on the relation of [[adiabatic accessibility]] between equilibrium states was given by [[Elliott H. Lieb|E. H. Lieb]] and [[Jakob Yngvason|J. Yngvason]] in 1999.<ref>{{cite journal |last1=Lieb |first1=Elliott H. |last2=Yngvason |first2=Jakob |title=The physics and mathematics of the second law of thermodynamics |journal=Physics Reports |date=March 1999 |volume=310 |issue=1 |pages=1–96 |doi=10.1016/S0370-1573(98)00082-9 |arxiv=cond-mat/9708200 |bibcode=1999PhR...310....1L |s2cid=119620408}}</ref> This approach has several predecessors, including the pioneering work of [[Constantin Carathéodory]] from 1909<ref>{{cite journal |last1=Carathéodory |first1=C. |title=Untersuchungen über die Grundlagen der Thermodynamik |journal=Mathematische Annalen |date=September 1909 |volume=67 |issue=3 |pages=355–386 |doi=10.1007/BF01450409 |s2cid=118230148 |url=https://zenodo.org/record/1428268 |language=de}}</ref> and the monograph by R. Giles.<ref>{{cite book |author=R. Giles |title=Mathematical Foundations of Thermodynamics: International Series of Monographs on Pure and Applied Mathematics |url=https://books.google.com/books?id=oK03BQAAQBAJ |date=2016 |publisher=Elsevier Science |isbn=978-1-4831-8491-3}}</ref> In the setting of Lieb and Yngvason, one starts by picking, for a unit amount of the substance under consideration, two reference states <math display="inline">X_0</math> and <math display="inline">X_1</math> such that the latter is adiabatically accessible from the former but not conversely.
Defining the entropies of the reference states to be 0 and 1 respectively, the entropy of a state <math display="inline">X</math> is defined as the largest number <math display="inline">\lambda</math> such that <math display="inline">X</math> is adiabatically accessible from a composite state consisting of an amount <math display="inline">\lambda</math> in the state <math display="inline">X_1</math> and a complementary amount, <math display="inline">(1 - \lambda)</math>, in the state <math display="inline">X_0</math>. A simple but important result within this setting is that entropy is uniquely determined, apart from a choice of unit and an additive constant for each chemical element, by the following properties: it is monotonic with respect to the relation of adiabatic accessibility, additive on composite systems, and extensive under scaling.

=== Entropy in quantum mechanics ===
{{Main|von Neumann entropy}}
In [[quantum statistical mechanics]], the concept of entropy was developed by [[John von Neumann]] and is generally referred to as "[[von Neumann entropy]]":
<math display="block">S = - k_\mathsf{B}\ \mathrm{tr}{\left( \hat{\rho} \ln{\hat{\rho}} \right)}</math>
where <math display="inline">\hat{\rho}</math> is the [[density matrix]], <math display="inline">\mathrm{tr}</math> is the [[Trace class|trace operator]] and <math display="inline">k_\mathsf{B}</math> is the [[Boltzmann constant]]. This upholds the [[correspondence principle]], because in the [[classical limit]], when the phases between the basis states are purely random, this expression is equivalent to the familiar classical definition of entropy for states with classical probabilities <math display="inline">p_i</math>:
<math display="block">S = - k_\mathsf{B} \sum_i{p_i \ln{p_i}}</math>
i.e. in such a basis the density matrix is diagonal.

Von Neumann established a rigorous mathematical framework for quantum mechanics with his work {{lang|de|Mathematische Grundlagen der Quantenmechanik}}. In this work he provided a theory of measurement, in which the usual notion of [[wave function collapse]] is described as an irreversible process (the so-called von Neumann or [[projective measurement]]). Using this concept, in conjunction with the [[density matrix]], he extended the classical concept of entropy into the quantum domain.

=== Information theory ===
{{Main|Entropy (information theory)|Entropy in thermodynamics and information theory|Entropic uncertainty}}
{{quote box |align=right |width=30em |quote=I thought of calling it "information", but the word was overly used, so I decided to call it "uncertainty". [...] Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage." |source=Conversation between [[Claude Shannon]] and [[John von Neumann]] regarding what name to give to the [[attenuation]] in phone-line signals<ref>{{cite journal | last1 = Tribus | first1 = M. | last2 = McIrvine | first2 = E. C. | year = 1971 | title = Energy and information | url=https://www.jstor.org/stable/24923125 | journal = Scientific American | volume = 224 | issue = 3 | pages = 178–184 | jstor = 24923125 }}</ref>}}
When viewed in terms of [[information theory]], the entropy state function is the amount of information in the system that is needed to fully specify the microstate of the system.
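The correspondence noted above between the von Neumann expression and the classical form can be checked numerically: in its eigenbasis a density matrix is diagonal, so its von Neumann entropy equals the Shannon-type entropy of its eigenvalue distribution. The following is a minimal sketch of that check (assuming NumPy is available; the two-level density matrix is a made-up example, not taken from any source cited here).
<syntaxhighlight lang="python">
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho ln rho), in units of the Boltzmann constant (i.e. in nats)."""
    eigvals = np.linalg.eigvalsh(rho)       # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]      # convention: 0 ln 0 = 0
    return -np.sum(eigvals * np.log(eigvals))

def classical_entropy(p):
    """-sum_i p_i ln p_i for a discrete probability distribution, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Hypothetical two-level (qubit) density matrix with off-diagonal coherences.
rho = np.array([[0.70, 0.25],
                [0.25, 0.30]])

# Both calls print the same value (about 0.47 nats): in its eigenbasis the
# density matrix is diagonal, and the quantum entropy reduces to the classical form.
print(von_neumann_entropy(rho))
print(classical_entropy(np.linalg.eigvalsh(rho)))
</syntaxhighlight>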
[[Entropy (information theory)|Entropy]] is the measure of the amount of missing information before reception.<ref>{{cite book|first=Roger |last=Balian |chapter=Entropy, a Protean concept |editor-last=Dalibard |editor-first=Jean |title=Poincaré Seminar 2003: Bose-Einstein condensation – entropy |year=2004 |publisher=Birkhäuser |location=Basel |isbn=978-3-7643-7116-6 |pages=119–144}}</ref> Often called ''Shannon entropy'', it was originally devised by [[Claude Shannon]] in 1948 to study the amount of information in a transmitted message. The definition of information entropy is expressed in terms of a discrete set of probabilities <math display="inline">p_i</math> so that:
<math display="block">H(X) = - \sum_{i=1}^n{p(x_i) \log{p(x_i)}}</math>
where the base of the logarithm determines the units (for example, the [[binary logarithm]] corresponds to [[bit]]s). In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of the average amount of information in a message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of binary questions needed to determine the content of the message.<ref name="Perplexed" />

Most researchers consider information entropy and thermodynamic entropy directly linked to the same concept,<ref>{{Cite book|last=Brillouin|first=Leon|title=Science and Information Theory|year= 1956|publisher=Dover Publications |isbn=978-0-486-43918-1}}</ref><ref name="Georgescu-Roegen 1971">{{Cite book|last=Georgescu-Roegen|first=Nicholas|title=The Entropy Law and the Economic Process|publisher=Harvard University Press|year=1971|isbn=978-0-674-25781-8 |url = https://archive.org/details/entropylawe00nich}}</ref><ref>{{Cite book|last=Chen|first=Jing|title=The Physical Foundation of Economics – an Analytical Thermodynamic Theory|publisher=World Scientific|year=2005|isbn=978-981-256-323-1}}</ref><ref>{{cite journal | last1 = Kalinin | first1 = M.I. | last2 = Kononogov | first2 = S.A. | year = 2005 | title = Boltzmann's constant | journal = Measurement Techniques | volume = 48 | issue = 7| pages = 632–636 | doi=10.1007/s11018-005-0195-9| bibcode = 2005MeasT..48..632K | s2cid = 118726162 }}</ref><ref>{{cite book|last1=Ben-Naim|first1=Arieh|title=Entropy demystified the second law reduced to plain common sense|url=https://archive.org/details/entropydemystifi0000benn|url-access=registration|date= 2008|publisher=World Scientific|location=Singapore|isbn=9789812832269|edition=Expanded}}</ref> while others argue that they are distinct.<ref>{{cite book|first1=Joseph J.|last1=Vallino|first2=Christopher K. |last2=Algar|first3=Nuria Fernández|last3=González|first4=Julie A.|last4=Huber|editor-first1=Roderick C.|editor-last1=Dewar|editor-first2=Charles H.|editor-last2=Lineweaver|editor-first3=Robert K.|editor-last3=Niven|editor-first4=Klaus|editor-last4=Regenauer-Lieb|date= 2013|chapter=Use of Receding Horizon Optimal Control to Solve MaxEP-Based (max entropy production) Biogeochemistry Problems |department=Living Systems as Catalysts|chapter-url=https://books.google.com/books?id=xF65BQAAQBAJ&pg=PA340|title=Beyond the Second Law: Entropy Production & Non-equilibrium Systems|page=340|isbn=978-3642401534|publisher=Springer |access-date=31 August 2019 |quote=...ink on the page forms a pattern that contains information, the entropy of the page is lower than a page with randomized letters; however, the reduction of entropy is trivial compared to the entropy of the paper the ink is written on. If the paper is burned, it hardly matters in a thermodynamic context if the text contains the meaning of life or only {{sic|jibberish}}.}}</ref> Both expressions are mathematically similar. If <math display="inline">W</math> is the number of microstates that can yield a given macrostate, and each microstate has the same ''[[A priori knowledge|a priori]]'' probability, then that probability is <math display="inline">p = 1/W</math>. The Shannon entropy (in [[Nat (unit)|nats]]) is:
<math display="block">H = - \sum_{i=1}^W{p_i \ln{p_i}} = \ln{W}</math>
and if entropy is measured in units of <math display="inline">k</math> per nat, then the entropy is given by:
<math display="block">H = k \ln{W}</math>
which is the [[Boltzmann's entropy formula|Boltzmann entropy formula]], where <math display="inline">k</math> is the Boltzmann constant, which may be interpreted as the thermodynamic entropy per nat. Some authors argue for dropping the word entropy for the <math display="inline">H</math> function of information theory and using Shannon's other term, "uncertainty", instead.<ref>Schneider, Tom, DELILA system (Deoxyribonucleic acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, Frederick, MD.</ref>

=== Measurement ===
The entropy of a substance can be measured, although in an indirect way.
The measurement, known as entropymetry,<ref>{{Cite journal|last1=Kim|first1=Hye Jin|last2=Park|first2=Youngkyu|last3=Kwon|first3=Yoonjin|last4=Shin|first4=Jaeho|last5=Kim|first5=Young-Han|last6=Ahn|first6=Hyun-Seok|last7=Yazami|first7=Rachid|last8=Choi|first8=Jang Wook|year=2020|title=Entropymetry for non-destructive structural analysis of LiCoO<sub>2</sub> cathodes|url=http://xlink.rsc.org/?DOI=C9EE02964H|journal=Energy & Environmental Science|language=en|volume=13|issue=1|pages=286–296|doi=10.1039/C9EE02964H|bibcode=2020EnEnS..13..286K |s2cid=212779004|issn=1754-5692}}</ref> is done on a closed system with a constant number of particles <math display="inline">N</math> and a constant volume <math display="inline">V</math>, and it uses the definition of temperature<ref>{{cite book|last1=Schroeder|first1=Daniel V.|title=An introduction to thermal physics|url=https://archive.org/details/introductiontoth00schr_817|url-access=limited|date=2000|publisher=Addison Wesley|location=San Francisco, CA [u.a.]|isbn=978-0-201-38027-9|page=[https://archive.org/details/introductiontoth00schr_817/page/n99 88]|edition=[Nachdr.]}}</ref> in terms of entropy, while limiting energy exchange to heat <math display="inline">\mathrm{d} U \rightarrow \mathrm{d} Q</math>:
<math display="block">T := {\left( \frac{\partial U}{\partial S} \right)}_{V, N}\ \Rightarrow\ \cdots\ \Rightarrow\ \mathrm{d} S = \frac{\mathrm{d} Q}{T}</math>
The resulting relation describes how the entropy changes by <math display="inline">\mathrm{d} S</math> when a small amount of energy <math display="inline">\mathrm{d} Q</math> is introduced into the system at a certain temperature <math display="inline">T</math>.

The process of measurement goes as follows. First, a sample of the substance is cooled as close to absolute zero as possible. At such temperatures, the entropy approaches zero, in accordance with the [[third law of thermodynamics]]. Then, small amounts of heat are introduced into the sample and the change in temperature is recorded, until the temperature reaches a desired value (usually 25 °C). The data obtained allow the equation above to be integrated, yielding the absolute value of the entropy of the substance at the final temperature. This value of the entropy is called the calorimetric entropy.<ref>{{cite web|title=Measuring Entropy|url=https://www.chem.wisc.edu/deptfiles/genchem/netorial/modules/thermodynamics/entropy/entropy04.htm|website=chem.wisc.edu}}</ref>
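The following is a minimal numerical sketch of this procedure (the heat-capacity curve and all values are hypothetical stand-ins for recorded calorimetric data, not taken from the sources above): the measured <math display="inline">C(T)</math> is divided by <math display="inline">T</math> and integrated from near absolute zero up to 25 °C.
<syntaxhighlight lang="python">
import numpy as np

# Minimal sketch of the calorimetric procedure described above. The heat-capacity
# curve below is a made-up stand-in for real data; in an actual measurement C(T)
# comes from the recorded heat pulses dQ and the resulting temperature steps dT.
T = np.linspace(5.0, 298.15, 1000)      # temperatures from near 0 K up to 25 °C, in K
C_inf, theta = 25.0, 150.0              # illustrative high-T heat capacity (J/K) and crossover temperature (K)
C = C_inf * (T / theta)**3 / (1.0 + (T / theta)**3)   # hypothetical Debye-like C(T), J/K

# dS = dQ/T = C(T) dT / T, and S -> 0 as T -> 0 (third law), so the absolute
# (calorimetric) entropy at 298.15 K is the integral of C(T)/T from (near) 0 K.
# The small contribution below the lowest data point is neglected in this sketch.
integrand = C / T
S_298 = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T))   # trapezoidal rule, J/K
print(S_298)
</syntaxhighlight>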