== Applications ==

=== The fundamental thermodynamic relation ===
{{Main|Fundamental thermodynamic relation}}
The entropy of a system depends on its internal energy and its external parameters, such as its volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy <math display="inline">U</math> to changes in the entropy and the external parameters. This relation is known as the ''fundamental thermodynamic relation''. If external pressure <math display="inline">p</math> bears on the volume <math display="inline">V</math> as the only external parameter, this relation is:<math display="block">\mathrm{d} U = T\ \mathrm{d} S - p\ \mathrm{d} V</math>Since both internal energy and entropy are monotonic functions of temperature <math display="inline">T</math>, the internal energy is fixed when one specifies the entropy and the volume; hence the relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (during such a change the system may be very far out of thermal equilibrium, in which case the whole-system entropy, pressure, and temperature may not exist). The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the [[Maxwell relations]] and the [[relations between heat capacities]].

=== Entropy in chemical thermodynamics ===
Thermodynamic entropy is central in [[chemical thermodynamics]], enabling changes to be quantified and the outcome of reactions predicted. The [[second law of thermodynamics]] states that entropy in an [[isolated system]] — the combination of a subsystem under study and its surroundings — increases during all spontaneous chemical and physical processes.
The [[Clausius theorem|Clausius equation]] introduces the measurement of entropy change, which describes the direction and quantifies the magnitude of simple changes such as heat transfer between systems, where heat always flows spontaneously from the hotter body to the cooler one. Thermodynamic entropy is an [[Intensive and extensive properties|extensive]] property, meaning that it scales with the size or extent of a system. In many processes it is useful to specify the entropy as an [[Intensive and extensive properties|intensive property]] independent of the size, as a specific entropy characteristic of the type of system studied. Specific entropy may be expressed relative to a unit of mass, typically the kilogram (unit: J⋅kg<sup>−1</sup>⋅K<sup>−1</sup>). Alternatively, in chemistry, it may also be expressed per [[Mole (unit)|mole]] of substance, in which case it is called the ''molar entropy'' with a unit of J⋅mol<sup>−1</sup>⋅K<sup>−1</sup>. Thus, when one mole of substance at about {{val|0|u=K}} is warmed by its surroundings to {{val|298|u=K}}, the sum of the incremental values of <math display="inline">q_\mathsf{rev} / T</math> constitutes each element's or compound's standard molar entropy, an indicator of the amount of energy stored by a substance at {{val|298|u=K}}.<ref name="ctms">{{Cite book|last=Moore|first=J. W.|author2=C. L. Stanistski|author3=P. C. Jurs|title=Chemistry, The Molecular Science|publisher=Brooks Cole|year=2005|isbn=978-0-534-42201-1|url-access=registration|url=https://archive.org/details/chemistrymolecul0000moor}}</ref><ref name="Jungermann">{{cite journal|last1=Jungermann|first1=A.H.|s2cid=18081336|year=2006|title=Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property|journal=Journal of Chemical Education|volume=83|issue=11|pages=1686–1694|doi=10.1021/ed083p1686|bibcode=2006JChEd..83.1686J}}</ref> Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture.<ref>{{Cite book|last=Levine|first=I. N.|title=Physical Chemistry, 5th ed.|url=https://archive.org/details/physicalchemistr00levi_1|url-access=registration|publisher=McGraw-Hill|year=2002|isbn=978-0-07-231808-1}}</ref>

Entropy is equally essential in predicting the extent and direction of complex chemical reactions. For such applications, <math display="inline">\Delta S</math> must be incorporated in an expression that includes both the system and its surroundings: <math display="block">\Delta S_\mathsf{universe} = \Delta S_\mathsf{surroundings} + \Delta S_\mathsf{system}</math>Via additional steps this expression becomes the equation for the [[Gibbs free energy]] change <math display="inline">\Delta G</math> for reactants and products in the system at constant pressure and temperature <math display="inline">T</math>:<math display="block">\Delta G = \Delta H - T\ \Delta S</math>where <math display="inline">\Delta H</math> is the [[enthalpy]] change and <math display="inline">\Delta S</math> is the entropy change.<ref name="ctms" />

{| class="wikitable"
!'''ΔH'''
!'''ΔS'''
!'''Spontaneity'''
!'''Example'''
|-
| +
| +
|Spontaneous '''at high ''T'''''
|Ice melting
|-
|–
|–
|Spontaneous '''at low ''T'''''
|Water freezing
|-
|–
| +
|Spontaneous '''at all ''T'''''
|Propane combustion
|-
| +
|–
|'''Non-spontaneous''' at all ''T''
|Ozone formation
|}

The spontaneity of a chemical or physical process is thus governed by the sign of ΔG. A negative ΔG indicates a thermodynamically favorable ([[Spontaneous process|spontaneous]]) process, while a positive ΔG denotes a non-spontaneous one. When both ΔH and ΔS are positive ([[Endothermic process|endothermic]], entropy-increasing), the reaction becomes spontaneous at sufficiently high temperatures, as the ''T''ΔS term dominates. Conversely, if both ΔH and ΔS are negative (exothermic, entropy-decreasing), spontaneity occurs only at low temperatures, where the enthalpy term prevails. Reactions with ΔH < 0 and ΔS > 0 ([[Exothermic process|exothermic]] and entropy-increasing) are spontaneous at all temperatures, while those with ΔH > 0 and ΔS < 0 (endothermic and entropy-decreasing) are non-spontaneous regardless of temperature. These principles underscore the interplay between energy exchange, disorder, and temperature in determining the direction of natural processes, from phase transitions to biochemical reactions.
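The sign logic summarized in the table can be sketched numerically. The following Python snippet is an illustrative sketch, not part of the article; the water values are approximate textbook figures (enthalpy of fusion ≈ 6.01 kJ/mol, entropy of fusion ≈ 22 J/(mol·K)).

```python
def gibbs_free_energy(delta_h, delta_s, temperature):
    """Gibbs free energy change dG = dH - T*dS (J/mol), given the
    enthalpy change dH (J/mol), entropy change dS (J/(mol*K)) and
    absolute temperature T (K)."""
    return delta_h - temperature * delta_s

def is_spontaneous(delta_h, delta_s, temperature):
    """A process is thermodynamically favorable when dG < 0."""
    return gibbs_free_energy(delta_h, delta_s, temperature) < 0

# Ice melting: dH > 0 and dS > 0, so spontaneity requires a high
# enough T that the T*dS term dominates (approximate values for H2O).
dH, dS = 6010.0, 22.0  # J/mol, J/(mol*K)
print(is_spontaneous(dH, dS, 298.15))  # above 0 degC: True
print(is_spontaneous(dH, dS, 263.15))  # below 0 degC: False
```

With these figures the crossover temperature dH/dS ≈ 273 K falls, as expected, at the melting point of ice.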
=== World's technological capacity to store and communicate entropic information ===
{{See also|Entropy (information theory)}}
A 2011 study in ''[[Science (journal)|Science]]'' estimated the world's technological capacity to store and communicate optimally compressed information, normalised on the most effective compression algorithms available in the year 2007, thereby estimating the entropy of the technologically available sources.<ref name="HilbertLopez2011">{{Cite journal|last1=Hilbert|first1=Martin|last2=López|first2=Priscila|date=11 February 2011|title=The World's Technological Capacity to Store, Communicate, and Compute Information|journal=Science|language=en|volume=332|issue=6025|pages=60–65|doi=10.1126/science.1200970|pmid=21310967|bibcode=2011Sci...332...60H|s2cid=206531385|issn=0036-8075|doi-access=free}}</ref> The authors estimate that humankind's technological capacity to store information grew from 2.6 (entropically compressed) [[exabytes]] in 1986 to 295 (entropically compressed) exabytes in 2007. The world's technological capacity to receive information through one-way broadcast networks grew from 432 exabytes of (entropically compressed) information in 1986 to 1.9 [[zettabytes]] in 2007, and its effective capacity to exchange information through two-way telecommunication networks grew from 281 [[petabytes]] of (entropically compressed) information in 1986 to 65 (entropically compressed) exabytes in 2007.<ref name="HilbertLopez2011"/>

=== Entropy balance equation for open systems ===
[[File:First law open system.svg|thumb|upright=1.4|During [[Steady-state (chemical engineering)|steady-state]] continuous operation, an entropy balance applied to an open system accounts for system entropy changes related to heat flow and mass flow across the system boundary.]]
In [[chemical engineering]], the principles of thermodynamics are commonly applied to "[[Open system (systems theory)|open systems]]", i.e.
those in which heat, [[work (thermodynamics)|work]], and [[mass]] flow across the system boundary. In general, flow of heat <math display="inline">\dot{Q}</math>, flow of shaft work <math display="inline">\dot{W}_\mathsf{S}</math> and pressure-volume work <math display="inline">P \dot{V}</math> across the system boundaries cause changes in the entropy of the system. Heat transfer entails entropy transfer <math display="inline">\dot{Q}/T</math>, where <math display="inline">T</math> is the absolute [[thermodynamic temperature]] of the system at the point of the heat flow. If there are mass flows across the system boundaries, they also influence the total entropy of the system. This account, in terms of heat and work, is valid only for cases in which the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.<ref>{{cite book|last=Born|first=Max|title=Natural Philosophy of Cause and Chance|url=https://books.google.com/books?id=er85jgEACAAJ|date=8 August 2015|publisher=BiblioLife|isbn=978-1-298-49740-6|pages=44, 146–147}}</ref><ref>{{cite book|last1=Haase|first1=R.|title=Thermodynamics|date=1971|publisher=Academic Press|location=New York|isbn=978-0-12-245601-5|pages=1–97}}</ref> To derive a generalised entropy balance equation, we start with the general balance equation for the change in any [[extensive quantity]] <math display="inline">\theta</math> in a [[thermodynamic system]], a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that <math display="inline">\mathrm{d} \theta / \mathrm{d} t</math>, i.e.
the rate of change of <math display="inline">\theta</math> in the system, equals the rate at which <math display="inline">\theta</math> enters the system at the boundaries, minus the rate at which <math display="inline">\theta</math> leaves the system across the system boundaries, plus the rate at which <math display="inline">\theta</math> is generated within the system. For an open thermodynamic system in which heat and work are transferred by paths separate from the paths for transfer of matter, using this generic balance equation, with respect to the rate of change with time <math display="inline">t</math> of the extensive quantity entropy <math display="inline">S</math>, the entropy balance equation is:<ref name="Pokrovskii 2020">{{Cite book|title=Thermodynamics of Complex Systems: Principles and applications|last=Pokrovskii|first=Vladimir|language=English|publisher=IOP Publishing|location=Bristol, UK|year=2020|bibcode=2020tcsp.book.....P}}</ref><ref>{{Cite book|last=Sandler|first=Stanley I.|title=Chemical and Engineering Thermodynamics|publisher=John Wiley & Sons|year=1989|isbn=978-0-471-83050-4}}</ref><ref group="note" name=overdot>The overdots represent derivatives of the quantities with respect to time.</ref><math display="block">\frac{\mathrm{d} S}{\mathrm{d} t} = \sum_{k=1}^K{\dot{M}_k \hat{S}_k + \frac{\dot{Q}}{T} + \dot{S}_\mathsf{gen}}</math>where <math display="inline">\sum_{k=1}^K{\dot{M}_k \hat{S}_k}</math> is the net rate of entropy flow due to the flows of mass <math display="inline">\dot{M}_k</math> into and out of the system with entropy per unit mass <math display="inline">\hat{S}_k</math>, <math display="inline">\dot{Q} / T</math> is the rate of entropy flow due to the flow of heat across the system boundary and <math display="inline">\dot{S}_\mathsf{gen}</math> is the rate of [[entropy production|entropy generation]] within the system, e.g.
by [[chemical reaction]]s, [[phase transition]]s, internal heat transfer or [[Friction|frictional effects]] such as [[viscosity]]. In the case of multiple heat flows the term <math display="inline">\dot{Q}/T</math> is replaced by <math display="inline">\sum_j{\dot{Q}_j/T_j}</math>, where <math display="inline">\dot{Q}_j</math> is the heat flow through the <math display="inline">j</math>-th port into the system and <math display="inline">T_j</math> is the temperature at the <math display="inline">j</math>-th port. The nomenclature "entropy balance" is misleading and often deemed inappropriate because entropy is not a conserved quantity. In other words, the term <math display="inline">\dot{S}_\mathsf{gen}</math> is never a known quantity but always a derived one based on the expression above. Therefore, the open system version of the second law is more appropriately described as the "entropy generation equation", since it specifies that:<math display="block">\dot{S}_\mathsf{gen} \ge 0</math>with the value zero for a reversible process and positive values for an irreversible one.
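As a minimal numerical sketch (not part of the article; the stream values are assumed for illustration), at steady state <math display="inline">\mathrm{d} S / \mathrm{d} t = 0</math>, so the balance can be rearranged to compute <math display="inline">\dot{S}_\mathsf{gen}</math> from the boundary terms alone:

```python
def entropy_generation_rate(mass_flows, specific_entropies, heat_flows, boundary_temps):
    """Entropy generation rate (W/K) of a steady-state open system,
    rearranged from 0 = sum(M_k * S_k) + sum(Q_j / T_j) + S_gen.
    mass_flows: kg/s, positive into the system, negative out.
    specific_entropies: J/(kg*K) of each stream.
    heat_flows: W, positive into the system; boundary_temps: K per port."""
    entropy_with_mass = sum(m * s for m, s in zip(mass_flows, specific_entropies))
    entropy_with_heat = sum(q / t for q, t in zip(heat_flows, boundary_temps))
    return -(entropy_with_mass + entropy_with_heat)

# Hypothetical adiabatic throttling valve: one stream in, one out,
# no heat ports. Specific entropy rises across the valve, so the
# second law requires S_gen > 0.
s_gen = entropy_generation_rate(
    mass_flows=[0.5, -0.5],               # kg/s in and out
    specific_entropies=[6800.0, 7050.0],  # J/(kg*K), assumed values
    heat_flows=[], boundary_temps=[])
print(s_gen)  # 125.0 W/K
```

A negative result from such a calculation would signal inconsistent data or a violation of the second law, since <math display="inline">\dot{S}_\mathsf{gen} \ge 0</math> must hold.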