=== Statistical mechanics ===
{{main|Entropy (statistical thermodynamics)}}
The statistical definition was developed by [[Ludwig Boltzmann]] in the 1870s by analysing the statistical behaviour of the microscopic components of the system. Boltzmann showed that this definition of entropy was equivalent to the thermodynamic entropy to within a constant factor—known as the [[Boltzmann constant]]. In short, the thermodynamic definition of entropy provides the experimental verification of entropy, while the statistical definition of entropy extends the concept, providing an explanation and a deeper understanding of its nature.

The [[Entropy (statistical thermodynamics)|interpretation of entropy in statistical mechanics]] is the measure of uncertainty, disorder, or ''mixedupness'' in the phrase of [[Josiah Willard Gibbs|Gibbs]], which remains about a system after its observable macroscopic properties, such as temperature, pressure and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible [[Microstate (statistical mechanics)|microstates]]. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system including the position and momentum of every molecule. The more such states are available to the system with appreciable probability, the greater the entropy.

In statistical mechanics, entropy is a measure of the number of ways a system can be arranged, often taken to be a measure of "disorder" (the higher the entropy, the higher the disorder).<ref name=McH>{{cite book |last1=Licker |first1=Mark D. |title=McGraw-Hill concise encyclopedia of chemistry |date=2004 |publisher=McGraw-Hill Professional |location=New York |isbn=978-0-07-143953-4}}</ref><ref name="Sethna78" /><ref>{{cite book |last1=Clark |first1=John O. E. |title=The essential dictionary of science |date=2004 |publisher=Barnes & Noble |location=New York |isbn=978-0-7607-4616-5}}</ref> This definition describes the entropy as being proportional to the natural logarithm of the number of possible microscopic configurations of the individual atoms and molecules of the system ([[microstate (statistical mechanics)|microstates]]) that could cause the observed macroscopic state ([[macrostate]]) of the system. The constant of proportionality is the [[Boltzmann constant]].

The Boltzmann constant, and therefore entropy, have [[dimension (physics)|dimensions]] of energy divided by temperature, which has a unit of [[joule]]s per [[kelvin]] (J⋅K<sup>−1</sup>) in the [[International System of Units]] (or kg⋅m<sup>2</sup>⋅s<sup>−2</sup>⋅K<sup>−1</sup> in terms of base units). The entropy of a substance is usually given as an [[Intensive and extensive properties#Intensive properties|intensive property]] — either entropy per unit [[mass]] (SI unit: J⋅K<sup>−1</sup>⋅kg<sup>−1</sup>) or entropy per unit [[amount of substance]] (SI unit: J⋅K<sup>−1</sup>⋅mol<sup>−1</sup>).
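A simple worked illustration of this proportionality, under the idealised assumption that the molecules occupy positions independently and that only positional configurations change: if one mole of an ideal gas expands freely into a vacuum so that its volume doubles, each of the <math display="inline">N_\mathsf{A}</math> molecules has twice as many accessible positions, so the number of microstates grows by a factor of <math display="inline">2^{N_\mathsf{A}}</math> and the entropy increases by
<math display="block">\Delta S = k_\mathsf{B} \ln{2^{N_\mathsf{A}}} = N_\mathsf{A} k_\mathsf{B} \ln{2} = R \ln{2} \approx 5.76\ \mathrm{J{\cdot}K^{-1}}</math>
in agreement with the classical thermodynamic result <math display="inline">\Delta S = n R \ln{\left( V_2 / V_1 \right)}</math> for the isothermal expansion of an ideal gas.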
Specifically, entropy is a [[logarithmic scale|logarithmic]] measure over the states of the system, each occupied with a probability <math display="inline">p_i</math> (usually given by the [[Boltzmann distribution]]):
<math display="block">S = - k_\mathsf{B} \sum_i{p_i \ln{p_i}}</math>
where <math display="inline">k_\mathsf{B}</math> is the [[Boltzmann constant]] and the summation is performed over all possible microstates of the system.<ref name="Perplexed">[http://charlottewerndl.net/Entropy_Guide.pdf Frigg, R. and Werndl, C. "Entropy – A Guide for the Perplexed"] {{Webarchive|url=https://web.archive.org/web/20110813112247/http://charlottewerndl.net/Entropy_Guide.pdf |date=13 August 2011 }}. In ''Probabilities in Physics''; Beisbart C. and Hartmann, S. (eds.); Oxford University Press, Oxford, 2010.</ref> If the states are defined in a continuous manner, the summation is replaced by an [[integral]] over all possible states; equivalently, we can consider the [[expected value]] of [[Entropy (information theory)#Rationale|the logarithm of the probability]] that a microstate is occupied:
<math display="block">S = - k_\mathsf{B} \left\langle \ln{p} \right\rangle</math>
This definition assumes that the basis states are chosen so that there is no information on their relative phases. In the general case the expression is:
<math display="block">S = - k_\mathsf{B}\ \mathrm{tr}{\left( \hat{\rho} \ln{\hat{\rho}} \right)}</math>
where <math display="inline">\hat{\rho}</math> is a [[density matrix]], <math>\mathrm{tr}</math> is the [[Trace class|trace operator]] and <math>\ln</math> is the [[matrix logarithm]]. The density matrix formalism is not required if the system is in thermal equilibrium, so long as the basis states are chosen to be [[Quantum state|eigenstates]] of the [[Hamiltonian (quantum mechanics)|Hamiltonian]]. For most practical purposes, this expression can be taken as the fundamental definition of entropy, since all other formulae for <math display="inline">S</math> can be derived from it, but not vice versa.

In what has been called ''[[Fundamental postulate of statistical mechanics|the fundamental postulate in statistical mechanics]]'', among system microstates of the same energy (i.e., [[Degenerate energy levels|degenerate microstates]]) each microstate is assumed to be populated with equal probability <math display="inline">p_i = 1 / \Omega</math>, where <math display="inline">\Omega</math> is the number of microstates whose energy equals that of the system. Usually, this assumption is justified for an isolated system in thermodynamic equilibrium.<ref>{{cite book|last1=Schroeder|first1=Daniel V.|title=An introduction to thermal physics|url=https://archive.org/details/introductiontoth00schr_817|url-access=limited|date=2000|publisher=Addison Wesley|location=San Francisco, CA |isbn=978-0-201-38027-9|page=[https://archive.org/details/introductiontoth00schr_817/page/n68 57]}}</ref> Then, in the case of an isolated system, the previous formula reduces to:
<math display="block">S = k_\mathsf{B} \ln{\Omega}</math>
In thermodynamics, such a system is one with a fixed volume, number of molecules, and internal energy, called a [[microcanonical ensemble]].

The most general interpretation of entropy is as a measure of the extent of uncertainty about a system. The [[equilibrium state]] of a system maximizes the entropy because it does not reflect all information about the initial conditions, except for the conserved variables.
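The following is a minimal numerical sketch of these formulae. The energy levels are arbitrary illustrative values, not tied to any physical system, and the helper names are hypothetical; it evaluates the Gibbs entropy of a Boltzmann distribution and checks that equal occupation probabilities recover <math display="inline">S = k_\mathsf{B} \ln{\Omega}</math>.
<syntaxhighlight lang="python">
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i of a discrete distribution."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]  # states with zero probability contribute nothing
    return -K_B * np.sum(p * np.log(p))

def boltzmann_distribution(energies, temperature):
    """Occupation probabilities p_i proportional to exp(-E_i / k_B T)."""
    weights = np.exp(-np.asarray(energies, dtype=float) / (K_B * temperature))
    return weights / weights.sum()

# Arbitrary illustrative energy levels (in joules) at T = 300 K.
energies = [0.0, 1.0e-21, 2.0e-21, 4.0e-21]
p = boltzmann_distribution(energies, temperature=300.0)
print(gibbs_entropy(p))  # entropy of this four-level system, in J/K

# Equal probabilities p_i = 1/Omega recover the microcanonical result S = k_B ln(Omega).
omega = 1_000_000
uniform = np.full(omega, 1.0 / omega)
assert np.isclose(gibbs_entropy(uniform), K_B * np.log(omega))
</syntaxhighlight>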
This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.<ref>{{Cite journal|last=Jaynes|first=E. T.|date=1 May 1965|title=Gibbs vs Boltzmann Entropies|url=https://aapt.scitation.org/doi/10.1119/1.1971557|journal=American Journal of Physics|volume=33|issue=5|pages=391–398|doi=10.1119/1.1971557|bibcode=1965AmJPh..33..391J|issn=0002-9505}}</ref> The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has deep implications when two observers use different sets of macroscopic variables. For example, consider observer A using the variables <math display="inline">U</math>, <math display="inline">V</math>, <math display="inline">W</math> and observer B using the variables <math display="inline">U</math>, <math display="inline">V</math>, <math display="inline">W</math>, <math display="inline">X</math>. If observer B changes variable <math display="inline">X</math>, then observer A will see a violation of the second law of thermodynamics, since A does not possess information about variable <math display="inline">X</math> and its influence on the system. In other words, one must choose a complete set of macroscopic variables to describe the system, i.e. every independent parameter that may change during the experiment.<ref>{{cite book |url=http://www.mdpi.org/lin/entropy/cgibbs.pdf |author=Jaynes, E. T. |chapter=The Gibbs Paradox |title=Maximum Entropy and Bayesian Methods |editor1=Smith, C. R. |editor2=Erickson, G. J. |editor3=Neudorfer, P. O. |publisher=Kluwer Academic: Dordrecht |year=1992 |pages=1–22 |access-date=17 August 2012}}</ref>

Entropy can also be defined for any [[Markov process]] with [[reversible dynamics]] and the [[detailed balance]] property.

In his 1896 ''Lectures on Gas Theory'', Boltzmann showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.
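As a hedged sketch of the Markov-process case, the entropy of the stationary distribution of a reversible chain can be computed in the same way. The two-state transition probabilities below are arbitrary illustrative values (any two-state chain satisfies detailed balance):
<syntaxhighlight lang="python">
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K

# Hypothetical two-state Markov chain; P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Stationary distribution pi: left eigenvector of P with eigenvalue 1, normalised to sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(P.T)
pi = np.real(eigenvectors[:, np.argmax(np.real(eigenvalues))])
pi = pi / pi.sum()

# Detailed balance: pi_i P_ij = pi_j P_ji for every pair of states (the probability-flow matrix is symmetric).
flow = pi[:, None] * P
assert np.allclose(flow, flow.T)

# Gibbs entropy of the stationary distribution, S = -k_B sum_i pi_i ln pi_i.
print(-K_B * np.sum(pi * np.log(pi)))
</syntaxhighlight>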