=== Classical thermodynamics ===
{{Main|Entropy (classical thermodynamics)}}
{{Conjugate variables (thermodynamics)}}

The thermodynamic definition of entropy was developed in the early 1850s by [[Rudolf Clausius]] and essentially describes how to measure the entropy of an [[isolated system]] in [[thermodynamic equilibrium]] with its parts. Clausius created the term entropy as an [[Intensive and extensive properties|extensive]] thermodynamic variable that was shown to be useful in characterizing the [[Carnot cycle]]. Heat transfer in the isothermal steps (isothermal expansion and isothermal compression) of the Carnot cycle was found to be proportional to the temperature of the system (known as its [[absolute temperature]]). This relationship was expressed as an increment of entropy equal to the incremental heat transfer divided by the temperature. Entropy was found to vary over the thermodynamic cycle but to return to the same value at the end of every cycle. Thus entropy was found to be a [[function of state]], specifically of the thermodynamic state of the system.

While Clausius based his definition on a reversible process, there are also irreversible processes that change entropy. Following the [[second law of thermodynamics]], the entropy of an isolated [[Thermodynamic system|system]] always increases for irreversible processes. The difference between an isolated system and a closed system is that energy may ''not'' flow to and from an isolated system, but energy flow to and from a closed system is possible. Nevertheless, for both closed and isolated systems, and indeed also in open systems, irreversible thermodynamic processes may occur.

According to the [[Clausius theorem|Clausius equality]], for a reversible cyclic thermodynamic process:
<math display="block">\oint{\frac{\delta Q_\mathsf{rev}}{T}} = 0</math>
which means the line integral <math display="inline">\int_L{\delta Q_\mathsf{rev} / T}</math> is [[State function|path-independent]]. Thus we can define a state function <math display="inline">S</math>, called ''entropy'':
<math display="block">\mathrm{d} S = \frac{\delta Q_\mathsf{rev}}{T}</math>
Therefore, thermodynamic entropy has the dimension of energy divided by temperature, and the unit [[joule]] per [[kelvin]] (J/K) in the International System of Units (SI).

To find the entropy difference between any two states of a system, the integral must be evaluated for some reversible path between the initial and final states.<ref>{{Cite book|last=Atkins|first=Peter|author2=Julio De Paula|title=Physical Chemistry, 8th ed.|publisher=Oxford University Press|year=2006|page=79|isbn=978-0-19-870072-2}}</ref> Since entropy is a state function, the entropy change of the system for an irreversible path is the same as for a reversible path between the same two states.<ref>{{Cite book|last=Engel|first=Thomas|author2=Philip Reid|title=Physical Chemistry|publisher=Pearson Benjamin Cummings|year=2006|page=86|isbn=978-0-8053-3842-3}}</ref> However, the heat transferred to or from the surroundings, and the entropy change of the surroundings, do differ between the two paths. We can calculate the change of entropy only by integrating the above formula. To obtain the absolute value of the entropy, we consider the [[third law of thermodynamics]]: perfect crystals at [[absolute zero]] have an entropy <math display="inline">S = 0</math>.
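As an illustrative sketch only (a standard textbook case, not drawn from the cited sources), consider <math display="inline">n</math> moles of an ideal gas expanding reversibly and isothermally at temperature <math display="inline">T</math> from volume <math display="inline">V_1</math> to <math display="inline">V_2</math>. Since the internal energy of an ideal gas is unchanged at constant temperature, the heat absorbed along this reversible path equals the work done, <math display="inline">\delta Q_\mathsf{rev} = \frac{nRT}{V}\,\mathrm{d}V</math>, and integrating the definition above gives
<math display="block">\Delta S = \int \frac{\delta Q_\mathsf{rev}}{T} = \frac{1}{T} \int_{V_1}^{V_2} \frac{nRT}{V}\,\mathrm{d}V = nR \ln\frac{V_2}{V_1}</math>
which is positive for an expansion (<math display="inline">V_2 > V_1</math>), consistent with heat flowing into the system along this reversible path.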
From a macroscopic perspective, in [[classical thermodynamics]] the entropy is interpreted as a [[state function]] of a [[thermodynamic system]]: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. In any process where the system gives up an amount of energy <math>\Delta E</math> to the surroundings at temperature <math display="inline">T</math>, and its entropy falls by <math display="inline">\Delta S</math>, at least <math display="inline">T \cdot \Delta S</math> of that energy must be given up to the system's surroundings as heat; otherwise the process cannot go forward. In classical thermodynamics, the entropy of a system is defined if and only if it is in [[thermodynamic equilibrium]] (though [[chemical equilibrium]] is not required: for example, the entropy of a mixture of two moles of hydrogen and one mole of oxygen at [[Standard temperature and pressure|standard conditions]] is well-defined).
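As a rough numerical sketch (with arbitrarily chosen values), if a system's entropy falls by <math display="inline">\Delta S = 1~\mathrm{J/K}</math> while in contact with surroundings at <math display="inline">T = 300~\mathrm{K}</math>, then at least
<math display="block">T \cdot \Delta S = 300~\mathrm{K} \times 1~\mathrm{J/K} = 300~\mathrm{J}</math>
of energy must be released to the surroundings as heat for the process to be able to proceed.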