{{Short description|Probability distribution of energy states of a system}}
{{about|system energy states|particle energy levels and velocities|Maxwell–Boltzmann distribution}}
{{Use American English|date = March 2019}}
[[File:Exponential probability density.svg|upright=1.75|right|thumb|Boltzmann's distribution is an exponential distribution.]]
[[File:Boltzmann distribution graph.svg|upright=1.75|right|thumb|Boltzmann factor {{tmath|\tfrac{p_i}{p_j} }} (vertical axis) as a function of temperature {{mvar|T}} for several energy differences {{math|''ε<sub>i</sub>'' − ''ε<sub>j</sub>''}}.]]
In [[statistical mechanics]] and [[mathematics]], a '''Boltzmann distribution''' (also called the '''Gibbs distribution'''<ref name ="landau">{{cite book | author=Landau, Lev Davidovich |author2=Lifshitz, Evgeny Mikhailovich |name-list-style=amp | title=Statistical Physics |volume=5 |series=Course of Theoretical Physics |edition=3 |orig-year=1976 |year=1980 |place=Oxford |publisher=Pergamon Press|isbn=0-7506-3372-7|author-link=Lev Landau |author2-link=Evgeny Lifshitz }} Translated by J.B. Sykes and M.J. Kearsley. See section 28</ref>) is a [[probability distribution]] or [[probability measure]] that gives the probability that a system will be in a certain [[microstate (statistical mechanics)|state]] as a function of that state's energy and the temperature of the system. The distribution is expressed in the form:
:<math>p_i \propto \exp\left(- \frac{\varepsilon_i}{kT} \right)</math>
where {{mvar|p<sub>i</sub>}} is the probability of the system being in state {{mvar|i}}, {{math|exp}} is the [[exponential function]], {{mvar|ε<sub>i</sub>}} is the energy of that state, and the constant {{mvar|kT}} of the distribution is the product of the [[Boltzmann constant]] {{mvar|k}} and the [[thermodynamic temperature]] {{mvar|T}}. The symbol <math display="inline">\propto</math> denotes [[proportionality (mathematics)|proportionality]] (see {{section link||The distribution}} for the proportionality constant).

The term ''system'' here has a wide meaning; it can range from a single atom or a collection of a sufficient number of atoms{{r|landau}} to a macroscopic system such as a [[Natural gas storage|natural gas storage tank]]. The Boltzmann distribution can therefore be used to solve a wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.
The ''ratio'' of the probabilities of two states is known as the '''Boltzmann factor''' and characteristically depends only on the states' energy difference:
:<math>\frac{p_i}{p_j} = \exp\left( \frac{\varepsilon_j - \varepsilon_i}{kT} \right)</math>
The Boltzmann distribution is named after [[Ludwig Boltzmann]], who first formulated it in 1868 during his studies of the [[statistical mechanics]] of gases in [[thermal equilibrium]].<ref>{{cite journal |last=Boltzmann |first=Ludwig |author-link=Ludwig Boltzmann |year=1868 |title=Studien über das Gleichgewicht der lebendigen Kraft zwischen bewegten materiellen Punkten |trans-title=Studies on the balance of living force between moving material points |journal=Wiener Berichte |volume=58 |pages=517–560 }}</ref> Boltzmann's statistical work is laid out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium".<ref>{{Cite web |url=http://crystal.med.upenn.edu/sharp-lab-pdfs/2015SharpMatschinsky_Boltz1877_Entropy17.pdf |title=Translation of Ludwig Boltzmann's Paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium" |access-date=2017-05-11 |archive-date=2020-10-21 |archive-url=https://web.archive.org/web/20201021205227/http://crystal.med.upenn.edu/sharp-lab-pdfs/2015SharpMatschinsky_Boltz1877_Entropy17.pdf |url-status=dead }}</ref> The distribution was later investigated extensively, in its modern generic form, by [[Josiah Willard Gibbs]] in 1902.<ref name="gibbs">{{cite book |last=Gibbs |first=Josiah Willard |author-link=Josiah Willard Gibbs |title=Elementary Principles in Statistical Mechanics |year=1902 |publisher=[[Charles Scribner's Sons]] |location=New York|title-link=Elementary Principles in Statistical Mechanics }}</ref>

The Boltzmann distribution should not be confused with the [[Maxwell–Boltzmann distribution]] or [[Maxwell–Boltzmann statistics]]. The Boltzmann distribution gives the probability that a system will be in a certain ''state'' as a function of that state's energy,<ref name="Atkins, P. W. 2010">Atkins, P. W. (2010) ''Quanta'', W. H. Freeman and Company, New York</ref> while the Maxwell–Boltzmann distributions give the probabilities of particle ''speeds'' or ''energies'' in ideal gases. The distribution of energies in a ''one-dimensional'' gas, however, does follow the Boltzmann distribution.
==The distribution==
The Boltzmann distribution is a [[probability distribution]] that gives the probability of a certain state as a function of that state's energy and the temperature of the [[system]] to which the distribution is applied.<ref name="McQuarrie, A. 2000">{{cite book |last=McQuarrie |first=A. |year=2000 |title=Statistical Mechanics |publisher=University Science Books |location=Sausalito, CA |isbn=1-891389-15-7 }}</ref> It is given as
<math display="block"> p_i=\frac{1}{Q} \exp\left(- \frac{\varepsilon_i}{kT} \right) = \frac{ \exp\left(- \tfrac{\varepsilon_i}{kT} \right) }{ \displaystyle \sum_{j=1}^{M} \exp\left(- \tfrac{\varepsilon_j}{kT} \right) } </math>
where:
*{{math|exp()}} is the [[exponential function]],
*{{mvar|p<sub>i</sub>}} is the probability of state {{mvar|i}},
*{{mvar|ε<sub>i</sub>}} is the energy of state {{mvar|i}},
*{{mvar|k}} is the [[Boltzmann constant]],
*{{mvar|T}} is the [[absolute temperature]] of the system,
*{{mvar|M}} is the number of all states accessible to the system of interest,<ref name="McQuarrie, A. 2000"/><ref name="Atkins, P. W. 2010"/>
*{{mvar|Q}} (denoted {{mvar|Z}} by some authors) is the normalization denominator, which is the [[canonical partition function]]
<math display=block> Q = \sum_{j=1}^{M} \exp\left(- \tfrac{\varepsilon_j}{kT} \right) </math>
It results from the constraint that the probabilities of all accessible states must add up to 1.

Using [[Lagrange multipliers]], one can prove that the Boltzmann distribution is the distribution that maximizes the [[entropy]]
<math display=block>S(p_1,p_2,\cdots,p_M) = -\sum_{i=1}^{M} p_i\log_2 p_i</math>
subject to the normalization constraint <math display="inline">\sum p_i=1</math> and the constraint that <math display="inline">\sum p_i \varepsilon_i</math> equals a particular mean energy value, except for two special cases. These special cases occur when the mean value is either the minimum or the maximum of the energies {{mvar|ε<sub>i</sub>}}; in these cases, the entropy-maximizing distribution is a limit of Boltzmann distributions where {{mvar|T}} approaches zero from above or below, respectively.

The partition function can be calculated if we know the energies of the states accessible to the system of interest. For atoms, partition function values can be found in the [[National Institute of Standards and Technology|NIST]] Atomic Spectra Database.<ref>[http://physics.nist.gov/PhysRefData/ASD/levels_form.html NIST Atomic Spectra Database Levels Form] at nist.gov</ref>

The distribution shows that states with lower energy will always have a higher probability of being occupied than states with higher energy. It also gives the quantitative relationship between the probabilities of two states being occupied. The ratio of the probabilities for states {{mvar|i}} and {{mvar|j}} is given as
<math display=block>\frac{p_i}{p_j} = \exp\left( \frac{\varepsilon_j - \varepsilon_i}{kT} \right)</math>
where:
*{{mvar|p<sub>i</sub>}} is the probability of state {{mvar|i}},
*{{mvar|p<sub>j</sub>}} is the probability of state {{mvar|j}},
*{{mvar|ε<sub>i</sub>}} is the energy of state {{mvar|i}},
*{{mvar|ε<sub>j</sub>}} is the energy of state {{mvar|j}}.
The corresponding ratio of the populations of energy levels must also take their [[Degeneracy (quantum mechanics)|degeneracies]] into account.

The Boltzmann distribution is often used to describe the distribution of particles, such as atoms or molecules, over bound states accessible to them.
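A minimal numerical sketch of this formula (an illustration, not drawn from a cited source; the energy values and temperature below are arbitrary example numbers) computes the probabilities of a small set of states by evaluating the Boltzmann factors and normalizing by the partition function {{mvar|Q}}:

<syntaxhighlight lang="python">
import math

k = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_probabilities(energies, T):
    """Boltzmann probabilities of states with the given energies (J) at temperature T (K)."""
    # Shift by the minimum energy before exponentiating to avoid numerical
    # underflow; the shift cancels between numerator and denominator.
    e_min = min(energies)
    factors = [math.exp(-(e - e_min) / (k * T)) for e in energies]
    Q = sum(factors)  # partition function (up to the cancelled shift factor)
    return [f / Q for f in factors]

# Three hypothetical non-degenerate energy levels (illustrative values only)
energies = [0.0, 2.0e-21, 4.0e-21]  # joules
probs = boltzmann_probabilities(energies, T=300.0)
for i, p in enumerate(probs):
    print(f"state {i}: p = {p:.3f}")  # lower-energy states come out more probable
print(probs[0] / probs[1])  # the Boltzmann factor exp((e_1 - e_0) / kT)
</syntaxhighlight>

Subtracting the lowest energy before exponentiating is a standard numerical safeguard: the common factor cancels in the normalization and leaves the probabilities unchanged. The computation is exactly the softmax of the negated, scaled energies, a connection noted in the machine-learning context below.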
If we have a system consisting of many particles, the probability of a particle being in state {{mvar|i}} is effectively the probability that, if we pick a random particle from that system and check what state it is in, we will find it in state {{mvar|i}}. This probability is equal to the number of particles in state {{mvar|i}} divided by the total number of particles in the system, that is, the fraction of particles that occupy state {{mvar|i}}:
:<math>p_i = \frac{N_i}{N}</math>
where {{mvar|N<sub>i</sub>}} is the number of particles in state {{mvar|i}} and {{mvar|N}} is the total number of particles in the system. We may use the Boltzmann distribution to find this probability, which, as we have seen, equals the fraction of particles that are in state {{mvar|i}}. The equation giving the fraction of particles in state {{mvar|i}} as a function of the energy of that state is<ref name="Atkins, P. W. 2010"/>
<math display=block> \frac{N_i}{N} = \frac{ \exp\left(- \tfrac{\varepsilon_i}{kT} \right) }{ \displaystyle \sum_{j=1}^{M} \exp\left(- \tfrac{\varepsilon_j}{kT} \right) } </math>
This equation is of great importance to [[spectroscopy]]. In spectroscopy we observe a [[spectral line]] of atoms or molecules undergoing transitions from one state to another.<ref name="Atkins, P. W. 2010"/><ref>{{cite book |last1=Atkins |first1=P. W. |last2=de Paula |first2=J. |year=2009 |title=Physical Chemistry |edition=9th |publisher=Oxford University Press |location=Oxford |isbn=978-0-19-954337-3 }}</ref> In order for this to be possible, there must be some particles in the first state to undergo the transition. We can check whether this condition is fulfilled by finding the fraction of particles in the first state. If it is negligible, the transition is very likely not observed at the temperature for which the calculation was done. In general, a larger fraction of molecules in the first state means a higher number of transitions to the second state.<ref>{{cite book |last1=Skoog |first1=D. A. |last2=Holler |first2=F. J. |last3=Crouch |first3=S. R. |year=2006 |title=Principles of Instrumental Analysis |publisher=Brooks/Cole |location=Boston, MA |isbn=978-0-495-12570-9 }}</ref> This gives a stronger spectral line. However, there are other factors that influence the intensity of a spectral line, such as whether it is caused by an allowed or a [[forbidden transition]].
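As a rough, self-contained illustration of this use (the 200 cm<sup>−1</sup> energy gap below is an arbitrary example value, not taken from a cited measurement), one can estimate the fraction of a sample occupying the upper of two non-degenerate states:

<syntaxhighlight lang="python">
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

def upper_state_fraction(wavenumber_cm, T):
    """Fraction of particles in the upper of two non-degenerate states
    separated by an energy gap given as a wavenumber (cm^-1),
    at temperature T (K)."""
    delta_e = h * c * wavenumber_cm * 100.0  # cm^-1 -> m^-1 -> joules
    factor = math.exp(-delta_e / (k * T))    # Boltzmann factor for the gap
    return factor / (1.0 + factor)           # N_upper / (N_lower + N_upper)

# An illustrative 200 cm^-1 gap at room temperature: about 28% of the
# particles occupy the upper state, so transitions from it can be observed.
print(upper_state_fraction(200.0, 300.0))
</syntaxhighlight>

For a gap much larger than {{mvar|kT}}, the upper-state fraction becomes exponentially small, and transitions originating from that state are correspondingly difficult to observe.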
The [[softmax function]] commonly used in machine learning is related to the Boltzmann distribution:
:<math> (p_1, \ldots, p_M) = \operatorname{softmax} \left[- \frac{\varepsilon_1}{kT}, \ldots, - \frac{\varepsilon_M}{kT} \right] </math>

== Generalized Boltzmann distribution ==
A distribution of the form
:<math>\Pr\left(\omega\right)\propto\exp\left[\sum_{\eta=1}^{n}\frac{X_{\eta}x_{\eta}^{\left(\omega\right)}}{k_{B}T}-\frac{E^{\left(\omega\right)}}{k_{B}T}\right]</math>
is called the '''generalized Boltzmann distribution''' by some authors.<ref name="Gao2019">{{cite journal |last1= Gao |first1= Xiang |last2= Gallicchio |first2= Emilio |first3= Adrian |last3= Roitberg |date= 2019 |title= The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy |url= https://aip.scitation.org/doi/abs/10.1063/1.5111333|journal= The Journal of Chemical Physics|volume= 151|issue= 3|pages= 034113|doi= 10.1063/1.5111333|pmid= 31325924 |arxiv= 1903.02121 |bibcode= 2019JChPh.151c4113G |s2cid= 118981017 |access-date= }}</ref>

The Boltzmann distribution is a special case of the generalized Boltzmann distribution. The generalized Boltzmann distribution is used in statistical mechanics to describe the [[canonical ensemble]], the [[grand canonical ensemble]], and the [[isothermal–isobaric ensemble]]. It is usually derived from the [[principle of maximum entropy]], but there are other derivations.<ref name="Gao2019" /><ref name="Gao2022">{{cite journal |last1= Gao |first1= Xiang |date= March 2022 |title= The Mathematics of the Ensemble Theory |url= https://www.sciencedirect.com/science/article/pii/S2211379722000390|journal= Results in Physics|volume= 34|pages= 105230|doi= 10.1016/j.rinp.2022.105230 |bibcode= 2022ResPh..3405230G |s2cid= 221978379 |arxiv= 2006.00485 }}</ref>

The generalized Boltzmann distribution has the following properties:
* It is the only distribution for which the entropy as defined by the [[Entropy (statistical thermodynamics)#Gibbs entropy formula|Gibbs entropy formula]] matches the entropy as defined in [[Entropy (classical thermodynamics)|classical thermodynamics]].<ref name="Gao2019" />
* It is the only distribution that is mathematically consistent with the [[fundamental thermodynamic relation]] when state functions are described by ensemble averages.<ref name="Gao2022" />

== In statistical mechanics ==
{{main|Canonical ensemble|Maxwell–Boltzmann statistics}}
The Boltzmann distribution appears in [[statistical mechanics]] when considering closed systems of fixed composition that are in [[thermal equilibrium]] (equilibrium with respect to energy exchange). The most general case is the probability distribution for the canonical ensemble. Some special cases (derivable from the canonical ensemble) show the Boltzmann distribution in different aspects:
; [[Canonical ensemble]] (general case)
: The [[canonical ensemble]] gives the [[probabilities]] of the various possible states of a closed system of fixed volume, in thermal equilibrium with a [[heat bath]]. The canonical ensemble has a state probability distribution with the Boltzmann form.
; Statistical frequencies of subsystems' states (in a non-interacting collection)
: When the system of interest is a collection of many non-interacting copies of a smaller subsystem, it is sometimes useful to find the [[statistical frequency]] of a given subsystem state among the collection.
: The canonical ensemble has the property of separability when applied to such a collection: as long as the non-interacting subsystems have fixed composition, each subsystem's state is independent of the others and is likewise characterized by a canonical ensemble. As a result, the [[expectation value|expected]] statistical frequency distribution of subsystem states has the Boltzmann form.
; [[Maxwell–Boltzmann statistics]] of classical gases (systems of non-interacting particles)
: In particle systems, many particles share the same space and regularly change places with each other; the single-particle state space they occupy is a shared space. [[Maxwell–Boltzmann statistics]] give the expected number of particles found in a given single-particle state in a [[classical mechanics|classical]] gas of non-interacting particles at equilibrium. This expected-number distribution has the Boltzmann form.

Although these cases have strong similarities, it is helpful to distinguish them, as they generalize in different ways when the crucial assumptions are changed:
* When a system is in thermodynamic equilibrium with respect to both energy exchange ''and particle exchange'', the requirement of fixed composition is relaxed and a [[grand canonical ensemble]] is obtained rather than a canonical ensemble. On the other hand, if both composition and energy are fixed, then a [[microcanonical ensemble]] applies instead.
* If the subsystems within a collection ''do'' interact with each other, then the expected frequencies of subsystem states no longer follow a Boltzmann distribution, and may not even have an [[analytical solution]].<ref>A classic example of this is [[magnetic ordering]]. Systems of non-interacting [[Spin (physics)|spins]] show [[paramagnetic]] behaviour that can be understood with a single-particle canonical ensemble (resulting in the [[Brillouin function]]). Systems of ''interacting'' spins can show much more complex behaviour such as [[ferromagnetism]] or [[antiferromagnetism]].</ref> The canonical ensemble can, however, still be applied to the ''collective'' states of the entire system considered as a whole, provided the entire system is in thermal equilibrium.
* With ''[[quantum mechanics|quantum]]'' gases of non-interacting particles in equilibrium, the number of particles found in a given single-particle state does not follow Maxwell–Boltzmann statistics, and there is no simple closed-form expression for quantum gases in the canonical ensemble. In the grand canonical ensemble, the state-filling statistics of quantum gases are described by [[Fermi–Dirac statistics]] or [[Bose–Einstein statistics]], depending on whether the particles are [[fermion]]s or [[boson]]s, respectively.

== In mathematics ==
{{main|Gibbs measure|Log-linear model|Boltzmann machine}}
* In more general mathematical settings, the Boltzmann distribution is also known as the [[Gibbs measure]].
* In statistics and [[machine learning]], it is called a [[log-linear model]].
* In [[deep learning]], the Boltzmann distribution is used in the [[sampling distribution]] of [[stochastic neural network]]s such as the [[Boltzmann machine]], the [[restricted Boltzmann machine]], [[Energy based model|energy-based models]], and the [[Deep Boltzmann Machine|deep Boltzmann machine]]. The Boltzmann machine is considered an [[unsupervised learning]] model.
In the design of Boltzmann machines in deep learning, as the number of nodes increases, the difficulty of implementation in real-time applications becomes critical, so a different type of architecture, the [[restricted Boltzmann machine]], was introduced.

== In economics ==
The Boltzmann distribution can be used to allocate permits in [[emissions trading]].<ref name="Park, J.-W. 2012">Park, J.-W., Kim, C. U. and Isard, W. (2012) Permit allocation in emissions trading using the Boltzmann distribution. ''Physica A'' 391: 4883–4890</ref><ref>[http://www.technologyreview.com/view/425051/the-thorny-problem-of-fair-allocation/ The Thorny Problem Of Fair Allocation]. ''Technology Review'' blog. August 17, 2011. Cites and summarizes Park, Kim and Isard (2012).</ref> This allocation method, based on the Boltzmann distribution, can describe the most probable, natural, and unbiased distribution of emissions permits among multiple countries.

The Boltzmann distribution has the same form as the [[Multinomial logistic regression|multinomial logit]] model. As a [[discrete choice]] model, it has been very well known in economics since [[Daniel McFadden]] made the connection to random utility maximization.<ref>{{cite book|ref=none |last=Amemiya |first=Takeshi |chapter=Multinomial Logit Model |title=Advanced Econometrics |year=1985 |publisher=Basil Blackwell |location=Oxford |isbn=0-631-13345-3 |pages=295–299 |chapter-url=https://books.google.com/books?id=0bzGQE14CwEC&pg=PA296 }}</ref>

==See also==
*[[Bose–Einstein statistics]]
*[[Fermi–Dirac statistics]]
*[[Negative temperature]]
*[[Softmax function]]

== References ==
{{reflist|30em}}

{{Probability distributions}}

[[Category:Statistical mechanics]]
[[Category:Ludwig Boltzmann|Distribution]]