{{Short description|Physics of many interacting particles}}
{{use mdy dates|date=January 2019}}
{{Statistical mechanics}}
In [[physics]], '''statistical mechanics''' is a mathematical framework that applies [[Statistics|statistical methods]] and [[probability theory]] to large assemblies of microscopic entities. Sometimes called '''statistical physics''' or '''statistical thermodynamics''', its applications include many problems in a wide variety of fields such as [[biology]],<ref>{{cite journal |last1=Teschendorff |first1=Andrew E. |last2=Feinberg |first2=Andrew P. |title=Statistical mechanics meets single-cell biology |journal=Nature Reviews Genetics |date=July 2021 |volume=22 |issue=7 |pages=459–476 |doi=10.1038/s41576-021-00341-z |pmid=33875884 |pmc=10152720 }}</ref> [[neuroscience]],<ref>{{cite journal |last1=Advani |first1=Madhu |last2=Lahiri |first2=Subhaneil |last3=Ganguli |first3=Surya |title=Statistical mechanics of complex neural systems and high dimensional data |journal=Journal of Statistical Mechanics: Theory and Experiment |date=12 March 2013 |volume=2013 |issue=3 |pages=P03014 |doi=10.1088/1742-5468/2013/03/P03014 |arxiv=1301.7115 |bibcode=2013JSMTE..03..014A }}</ref> [[computer science]],<ref>{{cite book |doi=10.1007/978-981-16-7570-6 |title=Statistical Mechanics of Neural Networks |date=2021 |last1=Huang |first1=Haiping |isbn=978-981-16-7569-0 }}</ref><ref>{{cite journal |last1=Berger |first1=Adam L. |last2=Pietra |first2=Vincent J. Della |last3=Pietra |first3=Stephen A. Della |title=A maximum entropy approach to natural language processing |journal=Computational Linguistics |date=March 1996 |volume=22 |issue=1 |pages=39–71 |id={{INIST|3283782}} |url=https://aclanthology.org/J96-1002.pdf }}</ref> [[information theory]]<ref>{{cite journal |last1=Jaynes |first1=E. T. |title=Information Theory and Statistical Mechanics |journal=Physical Review |date=15 May 1957 |volume=106 |issue=4 |pages=620–630 |doi=10.1103/PhysRev.106.620 |bibcode=1957PhRv..106..620J }}</ref> and [[sociology]].<ref>{{cite journal |last1=Durlauf |first1=Steven N. |title=How can statistical mechanics contribute to social science? |journal=Proceedings of the National Academy of Sciences |date=14 September 1999 |volume=96 |issue=19 |pages=10582–10584 |doi=10.1073/pnas.96.19.10582 |doi-access=free |pmid=10485867 |pmc=33748 |bibcode=1999PNAS...9610582D }}</ref> Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.<ref>{{cite book|title = Introduction to Statistical Physics |last = Huang |first = Kerson |publisher= CRC Press| isbn = 978-1-4200-7902-9 |page=15 |edition = 2nd|date = 2009-09-21 }}</ref><ref>{{Cite book |last=Germano |first=R. |title=Física Estatística do Equilíbrio: um curso introdutório |publisher=Ciência Moderna |year=2022 |isbn=978-65-5842-144-3 |location=Rio de Janeiro |page=156 |language=Portuguese}}</ref>

Statistical mechanics arose out of the development of [[classical thermodynamics]], a field for which it was successful in explaining macroscopic physical properties – such as [[temperature]], [[pressure]], and [[heat capacity]] – in terms of microscopic parameters that fluctuate about average values and are characterized by [[probability distribution]]s.<ref name="Reif">{{cite book |last=Reif |first=Frederick |title=Fundamentals of Statistical and Thermal Physics |publisher=McGraw-Hill |year=1965 |isbn=978-0-07-051800-1 |pages=651 |url=https://books.google.com/books?id=ObsbAAAAQBAJ }}</ref>{{rp|1-4}}

While classical thermodynamics is primarily concerned with [[thermodynamic equilibrium]], statistical mechanics has been applied in [[non-equilibrium statistical mechanics]] to the issues of microscopically modeling the speed of [[irreversible process]]es that are driven by imbalances.<ref name="Reif" />{{rp|3}} Examples of such processes include [[chemical reaction]]s and flows of particles and heat. The [[fluctuation–dissipation theorem]] is the basic result obtained from applying non-equilibrium statistical mechanics to the simplest non-equilibrium situation: a steady-state current flow in a system of many particles.<ref name="Reif" />{{rp|572-573}}

== History ==
In 1738, Swiss physicist and mathematician [[Daniel Bernoulli]] published ''Hydrodynamica'', which laid the basis for the [[kinetic theory of gases]]. In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as [[heat]] is simply the kinetic energy of their motion.<ref name="uffink"/>

The founding of the field of statistical mechanics is generally credited to three physicists:
*[[Ludwig Boltzmann]], who developed the fundamental interpretation of [[entropy]] in terms of a collection of microstates
*[[James Clerk Maxwell]], who developed models of probability distribution of such states
*[[Josiah Willard Gibbs]], who coined the name of the field in 1884

In 1859, after reading a paper on the diffusion of molecules by [[Rudolf Clausius]], Scottish physicist [[James Clerk Maxwell]] formulated the [[Maxwell distribution]] of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range.<ref>See: * Maxwell, J.C. (1860) [https://books.google.com/books?id=-YU7AQAAMAAJ&pg=PA19 "Illustrations of the dynamical theory of gases. Part I. On the motions and collisions of perfectly elastic spheres,"] ''Philosophical Magazine'', 4th series, '''19''' : 19–32. * Maxwell, J.C. (1860) [https://books.google.com/books?id=DIc7AQAAMAAJ&pg=PA21 "Illustrations of the dynamical theory of gases. Part II. On the process of diffusion of two or more kinds of moving particles among one another,"] ''Philosophical Magazine'', 4th series, '''20''' : 21–37.</ref> This was the first-ever statistical law in physics.<ref>{{cite book |last = Mahon |first = Basil |title=The Man Who Changed Everything – the Life of James Clerk Maxwell |location=Hoboken, NJ |publisher=Wiley |year=2003 |isbn=978-0-470-86171-4 |oclc=52358254}}</ref> Maxwell also gave the first mechanical argument that molecular collisions entail an equalization of temperatures and hence a tendency towards equilibrium.<ref>{{cite journal | last = Gyenis | first = Balazs | doi = 10.1016/j.shpsb.2017.01.001 | title = Maxwell and the normal distribution: A colored story of probability, independence, and tendency towards equilibrium | journal = Studies in History and Philosophy of Modern Physics | volume = 57 | pages = 53–65 | year = 2017| arxiv = 1702.01411 | bibcode = 2017SHPMP..57...53G | s2cid = 38272381 }}</ref> Five years later, in 1864, [[Ludwig Boltzmann]], a young student in Vienna, came across Maxwell's paper and spent much of his life developing the subject further.

Statistical mechanics was initiated in the 1870s with the work of Boltzmann, much of which was collectively published in his 1896 ''Lectures on Gas Theory''.<ref>{{cite book |doi=10.1142/2012 |title=Statistical Thermodynamics and Stochastic Theory of Nonequilibrium Systems |series=Series on Advances in Statistical Mechanics |date=2005 |volume=8 |bibcode=2005stst.book.....E |isbn=978-981-02-1382-4 |last1=Ebeling |first1=Werner |last2=Sokolov |first2=Igor M. }}</ref> Boltzmann's original papers on the statistical interpretation of thermodynamics, the [[H-theorem]], [[transport theory (statistical physics)|transport theory]], [[thermal equilibrium]], the [[equation of state]] of gases, and similar subjects, occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies. Boltzmann introduced the concept of an equilibrium statistical ensemble and also investigated for the first time non-equilibrium statistical mechanics, with his [[H-theorem|''H''-theorem]].

[[File:Gibbs-Elementary principles in statistical mechanics.png|thumb|Cover of Gibbs' text on statistical mechanics]]
The term "statistical mechanics" was coined by the American mathematical physicist [[Josiah Willard Gibbs|J. Willard Gibbs]] in 1884.<ref>{{cite book |first1=J. W. |last1=Gibbs |date=1885 |title=On the Fundamental Formula of Statistical Mechanics, with Applications to Astronomy and Thermodynamics |oclc=702360353 }}</ref> According to Gibbs, the term "statistical", in the context of mechanics, i.e. statistical mechanics, was first used by the Scottish physicist [[James Clerk Maxwell]] in 1871:

{{blockquote|text="In dealing with masses of matter, while we do not perceive the individual molecules, we are compelled to adopt what I have described as the statistical method of calculation, and to abandon the strict dynamical method, in which we follow every motion by the calculus."|author=J. Clerk Maxwell<ref>James Clerk Maxwell, ''Theory of Heat'' (London, England: Longmans, Green, and Co., 1871), [https://books.google.com/books?id=DqAAAAAAMAAJ&pg=PA309 p. 309]</ref>}}

"Probabilistic mechanics" might today seem a more appropriate term, but "statistical mechanics" is firmly entrenched.<ref>{{cite book |title = The enigma of probability and physics |last=Mayants |first=Lazar |year=1984 |publisher=Springer |isbn=978-90-277-1674-3 |page=174 |url = https://books.google.com/books?id=zmwEfXUdBJ8C&pg=PA174 }}</ref> Shortly before his death, Gibbs published in 1902 ''[[Elementary Principles in Statistical Mechanics]]'', a book which formalized statistical mechanics as a fully general approach to address all mechanical systems – macroscopic or microscopic, gaseous or non-gaseous.<ref name="gibbs" /> Gibbs' methods were initially derived in the framework of [[classical mechanics]]; however, they were of such generality that they were found to adapt easily to the later [[quantum mechanics]], and they still form the foundation of statistical mechanics to this day.<ref name="tolman" />

== Principles: mechanics and ensembles ==
{{main|Mechanics|Statistical ensemble (mathematical physics)|l2=Statistical ensemble}}

In physics, two types of mechanics are usually examined: [[classical mechanics]] and [[quantum mechanics]]. For both types of mechanics, the standard mathematical approach is to consider two concepts:
*The complete state of the mechanical system at a given time, mathematically encoded as a [[phase space|phase point]] (classical mechanics) or a pure [[quantum state vector]] (quantum mechanics).
*An equation of motion which carries the state forward in time: [[Hamiltonian mechanics|Hamilton's equations]] (classical mechanics) or the [[Schrödinger equation]] (quantum mechanics).

Using these two concepts, the state at any other time, past or future, can in principle be calculated. There is, however, a disconnect between these laws and everyday life experiences, as we do not find it necessary (nor even theoretically possible) to know exactly at a microscopic level the simultaneous positions and velocities of each molecule while carrying out processes at the human scale (for example, when performing a chemical reaction). Statistical mechanics bridges this disconnect between the laws of mechanics and the practical experience of incomplete knowledge by adding some uncertainty about which state the system is in.

Whereas ordinary mechanics only considers the behaviour of a single state, statistical mechanics introduces the [[Statistical ensemble (mathematical physics)|statistical ensemble]], which is a large collection of virtual, independent copies of the system in various states. The statistical ensemble is a [[probability distribution]] over all possible states of the system. In classical statistical mechanics, the ensemble is a probability distribution over phase points (as opposed to a single phase point in ordinary mechanics), usually represented as a distribution in a [[phase space]] with [[canonical coordinates|canonical coordinate]] axes. In quantum statistical mechanics, the ensemble is a probability distribution over pure states and can be compactly summarized as a [[density matrix]].
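Both pictures can be made concrete with a short numerical sketch. The example below (Python with NumPy; the two-level system, the observable, the weights, and the sampling scheme are arbitrary choices made here for illustration, not taken from the cited sources) computes an ensemble average in the quantum case as <math>\langle A \rangle = \operatorname{Tr}(\rho A)</math> for a density matrix built from a mixture of pure states, and in the classical case as a weighted average over sampled phase points.

<syntaxhighlight lang="python">
import numpy as np

# --- Quantum: a spin-1/2 ensemble summarized by a density matrix ---
# Arbitrary mixture for illustration: 70% spin-up along z, 30% spin-up along x.
up_z = np.array([1.0, 0.0])
up_x = np.array([1.0, 1.0]) / np.sqrt(2)
weights = [0.7, 0.3]
rho = sum(w * np.outer(psi, psi.conj()) for w, psi in zip(weights, [up_z, up_x]))

sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])    # the observable A
print(np.trace(rho).real)                        # 1.0 (normalization of the ensemble)
print(np.trace(rho @ sigma_z).real)              # ensemble average <A> = Tr(rho A) = 0.7

# --- Classical: an ensemble as weights attached to phase points ---
rng = np.random.default_rng(0)
q, p = rng.normal(size=(2, 10_000))              # sampled phase points (q, p)
energy = 0.5 * (p**2 + q**2)                     # harmonic-oscillator energy, m = k = 1
w = np.exp(-energy)                              # an arbitrary energy-dependent weight
w /= w.sum()                                     # normalize to a probability distribution
print(np.sum(w * energy))                        # ensemble average of the energy under these weights
</syntaxhighlight>

In both cases the ensemble enters only through the probabilities attached to the states, exactly as in the definitions above.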
As is usual for probabilities, the ensemble can be interpreted in different ways:<ref name="gibbs" />
* an ensemble can be taken to represent the various possible states that a ''single system'' could be in ([[epistemic probability]], a form of knowledge), or
* the members of the ensemble can be understood as the states of the systems in experiments repeated on independent systems which have been prepared in a similar but imperfectly controlled manner ([[empirical probability]]), in the limit of an infinite number of trials.
These two meanings are equivalent for many purposes, and will be used interchangeably in this article.

However the probability is interpreted, each state in the ensemble evolves over time according to the equation of motion. Thus, the ensemble itself (the probability distribution over states) also evolves, as the virtual systems in the ensemble continually leave one state and enter another. The ensemble evolution is given by the [[Liouville's theorem (Hamiltonian)|Liouville equation]] (classical mechanics) or the [[von Neumann equation]] (quantum mechanics). These equations are simply derived by the application of the mechanical equation of motion separately to each virtual system contained in the ensemble, with the probability of the virtual system being conserved over time as it evolves from state to state.

One special class of ensembles consists of those that do not evolve over time. These ensembles are known as ''equilibrium ensembles'' and their condition is known as ''statistical equilibrium''. Statistical equilibrium occurs if, for each state in the ensemble, the ensemble also contains all of its future and past states with probabilities equal to the probability of being in that state. (By contrast, ''[[mechanical equilibrium]]'' is a state with a balance of forces that has ceased to evolve.) The study of equilibrium ensembles of isolated systems is the focus of statistical thermodynamics. Non-equilibrium statistical mechanics addresses the more general case of ensembles that change over time, and/or ensembles of non-isolated systems.

== Statistical thermodynamics ==
The primary goal of statistical thermodynamics (also known as equilibrium statistical mechanics) is to derive the [[classical thermodynamics]] of materials in terms of the properties of their constituent particles and the interactions between them. In other words, statistical thermodynamics provides a connection between the macroscopic properties of materials in [[thermodynamic equilibrium]] and the microscopic behaviours and motions occurring inside the material.

Whereas statistical mechanics proper involves dynamics, here the attention is focused on ''statistical equilibrium'' (steady state). Statistical equilibrium does not mean that the particles have stopped moving ([[mechanical equilibrium]]); rather, it means only that the ensemble is not evolving.

=== Fundamental postulate ===
A [[sufficient condition|sufficient]] (but not necessary) condition for statistical equilibrium of an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, etc.).<ref name="gibbs" /> There are many different equilibrium ensembles that can be considered, and only some of them correspond to thermodynamics.<ref name="gibbs" /> Additional postulates are necessary to motivate why the ensemble for a given system should have one form or another.
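The sufficient condition stated above can be checked directly in a toy model. The sketch below (Python with NumPy; the one-dimensional harmonic oscillator, the sampling, and the particular weight functions are arbitrary illustrative choices, not taken from the cited sources) evolves sampled phase points under the exact Hamiltonian flow and compares two ensembles: one whose weights depend only on the conserved energy, whose averages stay constant in time (statistical equilibrium), and one displaced ensemble, whose averages change as the ensemble evolves.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

def evolve(q, p, t):
    """Exact Hamiltonian flow of a harmonic oscillator (m = k = 1):
    each phase point (q, p) is rotated by an angle t in phase space."""
    return q * np.cos(t) + p * np.sin(t), p * np.cos(t) - q * np.sin(t)

def avg_q(q, p, w):
    """Ensemble average of the observable q under weights w."""
    return np.sum(w * q) / np.sum(w)

n = 100_000
q, p = rng.normal(size=(2, n))                    # sampled phase points

# Candidate equilibrium ensemble: weights depend only on the conserved energy H(q, p).
w_eq = np.exp(-0.5 * (q**2 + p**2))

# Non-equilibrium ensemble: a displaced cloud of phase points (weights favour q near 2).
w_neq = np.exp(-0.5 * ((q - 2.0)**2 + p**2))

for t in (0.0, 1.0, 2.0):
    qt, pt = evolve(q, p, t)
    print(t, avg_q(qt, pt, w_eq), avg_q(qt, pt, w_neq))
# The energy-only weights give <q> near 0 at every time (statistical equilibrium),
# while the displaced ensemble's <q> oscillates as the ensemble evolves.
</syntaxhighlight>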
A common approach found in many textbooks is to take the ''equal a priori probability postulate''.<ref name="tolman"/> This postulate states that
: ''For an isolated system with an exactly known energy and exactly known composition, the system can be found with ''equal probability'' in any [[microstate (statistical mechanics)|microstate]] consistent with that knowledge.''

The equal a priori probability postulate therefore provides a motivation for the [[microcanonical ensemble]] described below. There are various arguments in favour of the equal a priori probability postulate:
* [[Ergodic hypothesis]]: An ergodic system is one that evolves over time to explore "all accessible" states: all those with the same energy and composition. In an ergodic system, the microcanonical ensemble is the only possible equilibrium ensemble with fixed energy. This approach has limited applicability, since most systems are not ergodic.
* [[Principle of indifference]]: In the absence of any further information, we can only assign equal probabilities to each compatible situation.
* [[Maximum entropy thermodynamics|Maximum information entropy]]: A more elaborate version of the principle of indifference states that the correct ensemble is the ensemble that is compatible with the known information and that has the largest [[Gibbs entropy]] ([[information entropy]]).<ref>{{cite journal | last = Jaynes | first = E.| author-link = Edwin Thompson Jaynes | title = Information Theory and Statistical Mechanics | doi = 10.1103/PhysRev.106.620 | journal = Physical Review | volume = 106 | issue = 4 | pages = 620–630 | year = 1957 |bibcode = 1957PhRv..106..620J }}</ref>

Other fundamental postulates for statistical mechanics have also been proposed.<ref name="uffink"/><ref name="Gao2019" /><ref name="Gao2022" /> For example, recent studies show that the theory of statistical mechanics can be built without the equal a priori probability postulate.<ref name="Gao2019">{{cite journal |last1=Gao |first1=Xiang |last2=Gallicchio |first2=Emilio |last3=Roitberg |first3=Adrian E. |title=The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy |journal=The Journal of Chemical Physics |date=21 July 2019 |volume=151 |issue=3 |page=034113 |doi=10.1063/1.5111333 |pmid=31325924 |arxiv=1903.02121 |bibcode=2019JChPh.151c4113G }}</ref><ref name="Gao2022">{{cite journal |last1= Gao |first1= Xiang |date= March 2022 |title= The Mathematics of the Ensemble Theory |journal= Results in Physics|volume= 34|pages= 105230|doi= 10.1016/j.rinp.2022.105230 |bibcode= 2022ResPh..3405230G |s2cid= 221978379 |doi-access= free |arxiv= 2006.00485 }}</ref> One such formalism is based on the [[fundamental thermodynamic relation]] together with the following set of postulates:<ref name="Gao2019" />
{{ordered list
| The probability density function is proportional to some function of the ensemble parameters and random variables.
| Thermodynamic state functions are described by ensemble averages of random variables.
| The entropy as defined by the [[Entropy_(statistical_thermodynamics)#Gibbs entropy formula|Gibbs entropy formula]] matches the entropy as defined in [[Entropy (classical thermodynamics)|classical thermodynamics]].
}}
where the third postulate can be replaced by the following:<ref name="Gao2022" />
{{ordered list|start=3
| At infinite temperature, all the microstates have the same probability.
}}

===Three thermodynamic ensembles===
{{main|Ensemble (mathematical physics)|Microcanonical ensemble|Canonical ensemble|Grand canonical ensemble}}

There are three equilibrium ensembles with a simple form that can be defined for any [[isolated system]] bounded inside a finite volume.<ref name="gibbs"/> These are the most often discussed ensembles in statistical thermodynamics. In the macroscopic limit (defined below) they all correspond to classical thermodynamics.

; [[Microcanonical ensemble]] : describes a system with a precisely given energy and fixed composition (precise number of particles). The microcanonical ensemble contains with equal probability each possible state that is consistent with that energy and composition.
; [[Canonical ensemble]] : describes a system of fixed composition that is in [[thermal equilibrium]] with a [[heat bath]] of a precise [[thermodynamic temperature|temperature]]. The canonical ensemble contains states of varying energy but identical composition; the different states in the ensemble are accorded different probabilities depending on their total energy.
; [[Grand canonical ensemble]] : describes a system with non-fixed composition (uncertain particle numbers) that is in thermal and chemical equilibrium with a thermodynamic reservoir. The reservoir has a precise temperature, and precise [[chemical potential]]s for various types of particle. The grand canonical ensemble contains states of varying energy and varying numbers of particles; the different states in the ensemble are accorded different probabilities depending on their total energy and total particle numbers.

For systems containing many particles (the [[thermodynamic limit]]), all three of the ensembles listed above tend to give identical behaviour. It is then simply a matter of mathematical convenience which ensemble is used.<ref name="Reif" />{{rp|227}} The Gibbs theorem about equivalence of ensembles<ref>{{cite journal |doi=10.1007/s10955-015-1212-2|title=Equivalence and Nonequivalence of Ensembles: Thermodynamic, Macrostate, and Measure Levels|journal=Journal of Statistical Physics|volume=159|issue=5|pages=987–1016|year=2015|last1=Touchette|first1=Hugo|arxiv=1403.6608|bibcode=2015JSP...159..987T|s2cid=118534661}}</ref> was developed into the theory of the [[concentration of measure]] phenomenon,<ref>{{cite book |doi=10.1090/surv/089 |title=The Concentration of Measure Phenomenon |series=Mathematical Surveys and Monographs |date=2005 |volume=89 |isbn=978-0-8218-3792-4 |url=http://www.gbv.de/dms/bowker/toc/9780821837924.pdf }}{{pn|date=April 2024}}</ref> which has applications in many areas of science, from functional analysis to methods of [[artificial intelligence]] and [[big data]] technology.<ref>{{cite journal |last1=Gorban |first1=A. N. |last2=Tyukin |first2=I. Y. |title=Blessing of dimensionality: mathematical foundations of the statistical physics of data |journal=Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences |date=28 April 2018 |volume=376 |issue=2118 |pages=20170237 |doi=10.1098/rsta.2017.0237 |pmid=29555807 |pmc=5869543 |arxiv=1801.03421 |bibcode=2018RSPTA.37670237G }}</ref>

Important cases where the thermodynamic ensembles ''do not'' give identical results include:
* Microscopic systems.
* Large systems at a phase transition.
* Large systems with long-range interactions.

In these cases the correct thermodynamic ensemble must be chosen, as there are observable differences between these ensembles not just in the size of fluctuations, but also in average quantities such as the distribution of particles. The correct ensemble is that which corresponds to the way the system has been prepared and characterized – in other words, the ensemble that reflects the knowledge about that system.<ref name="tolman" />

{| class="wikitable" style="text-align: center"
|+ Thermodynamic ensembles<ref name="gibbs" />
|-
!
! [[Microcanonical ensemble|Microcanonical]]
! [[Canonical ensemble|Canonical]]
! [[Grand canonical ensemble|Grand canonical]]
|-
! Fixed variables
| <math>E, N, V</math>
| <math>T, N, V</math>
| <math>T, \mu, V</math>
|-
! rowspan="2" | Microscopic features
| Number of [[Microstate (statistical mechanics)|microstates]]
| [[Canonical partition function]]
| [[Grand partition function]]
|-
| <math>W</math>
| <math>Z = \sum_k e^{-E_k / k_B T}</math>
| <math>\mathcal{Z} = \sum_k e^{-(E_k - \mu N_k) / k_B T}</math>
|-
! rowspan="2" | Macroscopic function
| [[Boltzmann entropy]]
| [[Helmholtz free energy]]
| [[Grand potential]]
|-
| <math>S = k_B \log W</math>
| <math>F = -k_B T \log Z</math>
| <math>\Omega = -k_B T \log \mathcal{Z}</math>
|}

=== Calculation methods ===
Once the characteristic state function for an ensemble has been calculated for a given system, that system is 'solved' (macroscopic observables can be extracted from the characteristic state function). Calculating the characteristic state function of a thermodynamic ensemble is not necessarily a simple task, however, since it involves considering every possible state of the system. While some hypothetical systems have been exactly solved, the most general (and realistic) case is too complex for an exact solution. Various approaches exist to approximate the true ensemble and allow calculation of average quantities.

====Exact====
There are some cases which allow exact solutions.
* For very small microscopic systems, the ensembles can be directly computed by simply enumerating over all possible states of the system (using exact diagonalization in quantum mechanics, or integration over all of phase space in classical mechanics).
* Some large systems consist of many separable microscopic systems, and each of the subsystems can be analysed independently. Notably, [[ideal gas|idealized gases]] of non-interacting particles have this property, allowing exact derivations of [[Maxwell–Boltzmann statistics]], [[Fermi–Dirac statistics]], and [[Bose–Einstein statistics]].<ref name="tolman"/>
* A few large systems with interactions have been solved. By the use of subtle mathematical techniques, exact solutions have been found for a few [[toy model]]s.<ref>{{cite book | isbn = 978-0-12-083180-7 | title = Exactly solved models in statistical mechanics | last1 = Baxter | first1 = Rodney J. | year = 1982 | publisher = Academic Press Inc. }}{{pn|date=April 2024}}</ref> Some examples include the [[Bethe ansatz]], the [[square-lattice Ising model]] in zero field, and the [[hard hexagon model]].

====Monte Carlo====
{{main|Monte Carlo method in statistical mechanics}}

Although some problems in statistical physics can be solved analytically using approximations and expansions, most current research utilizes the large processing power of modern computers to simulate or approximate solutions. A common approach to statistical problems is to use a [[Monte Carlo simulation]] to yield insight into the properties of a [[complex system]].
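As an illustration, the canonical ensemble of a small two-dimensional [[Ising model]] can be sampled with a single-spin-flip Metropolis scheme, an instance of the Metropolis–Hastings algorithm discussed below. The sketch (Python with NumPy) is only indicative: the lattice size, temperature, and numbers of sweeps are arbitrary small values chosen here for illustration, and production calculations use much larger systems together with careful equilibration and error analysis.

<syntaxhighlight lang="python">
import numpy as np

# Metropolis sampling of the canonical ensemble for a small 2D Ising model
# (L x L spins, periodic boundaries, coupling J = 1, k_B = 1). Illustrative only.
rng = np.random.default_rng(2)
L, T = 16, 2.0
spins = rng.choice([-1, 1], size=(L, L))

def local_field(s, i, j):
    """Sum of the four neighbouring spins with periodic boundaries."""
    return s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]

def sweep(s):
    """One Metropolis sweep: propose single-spin flips and accept each with
    probability min(1, exp(-dE / T)), so that configurations are visited with
    the Boltzmann weight exp(-E / T) of the canonical ensemble."""
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        dE = 2.0 * s[i, j] * local_field(s, i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] = -s[i, j]

for _ in range(200):          # equilibration sweeps (discarded)
    sweep(spins)

m = []
for _ in range(1000):         # measurement sweeps
    sweep(spins)
    m.append(abs(spins.mean()))
print("mean |magnetization| per spin at T =", T, ":", np.mean(m))
</syntaxhighlight>

The acceptance rule <math>\min(1, e^{-\Delta E / k_B T})</math> is what makes the chain of visited configurations a sample of the Boltzmann distribution, so averages over the visited states approximate canonical ensemble averages.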
Monte Carlo methods are important in [[computational physics]], [[physical chemistry]], and related fields, and have diverse applications including [[medical physics]], where they are used to model radiation transport for radiation dosimetry calculations.<ref>{{cite journal | doi = 10.1088/0031-9155/59/4/R151 | pmid=24486639 | volume=59 | issue=4 | title=GPU-based high-performance computing for radiation therapy | journal=Physics in Medicine and Biology | pages=R151–R182|bibcode = 2014PMB....59R.151J | year=2014 | last1=Jia | first1=Xun | last2=Ziegenhein | first2=Peter | last3=Jiang | first3=Steve B | pmc=4003902 }}</ref><ref>{{cite journal | doi = 10.1088/0031-9155/59/6/R183 | volume=59 | issue=6 | title=Advances in kilovoltage x-ray beam dosimetry | journal=Physics in Medicine and Biology | pages=R183–R231|bibcode = 2014PMB....59R.183H | pmid=24584183 | date=Mar 2014| last1=Hill | first1=R | last2=Healy | first2=B | last3=Holloway | first3=L | last4=Kuncic | first4=Z | last5=Thwaites | first5=D | last6=Baldock | first6=C | s2cid=18082594 }}</ref><ref>{{cite journal | doi = 10.1088/0031-9155/51/13/R17 | pmid=16790908 | volume=51 | issue=13 | title=Fifty years of Monte Carlo simulations for medical physics | journal=Physics in Medicine and Biology | pages=R287–R301|bibcode = 2006PMB....51R.287R | year=2006 | last1=Rogers | first1=D W O | s2cid=12066026 }}</ref>

The [[Monte Carlo method]] examines just a few of the possible states of the system, with the states chosen randomly (with a fair weight). As long as these states form a representative sample of the whole set of states of the system, the approximate characteristic function is obtained. As more and more random samples are included, the errors are reduced to an arbitrarily low level.
* The [[Metropolis–Hastings algorithm]] is a classic Monte Carlo method which was initially used to sample the canonical ensemble.
* [[Path integral Monte Carlo]], also used to sample the canonical ensemble.

==== Other ====
* For rarefied non-ideal gases, approaches such as the [[cluster expansion]] use [[perturbation theory]] to include the effect of weak interactions, leading to a [[virial expansion]].<ref name="balescu" />
* For dense fluids, another approximate approach is based on reduced distribution functions, in particular the [[radial distribution function]].<ref name="balescu"/>
* [[Molecular dynamics]] computer simulations can be used to calculate [[microcanonical ensemble]] averages, in ergodic systems. With the inclusion of a connection to a stochastic heat bath, they can also model canonical and grand canonical conditions.
* Mixed methods involving non-equilibrium statistical mechanical results (see below) may be useful.

== Non-equilibrium statistical mechanics ==
{{see also|Non-equilibrium thermodynamics}}

Many physical phenomena involve quasi-thermodynamic processes out of equilibrium, for example:
* [[Thermal conduction|heat transport by the internal motions in a material]], driven by a temperature imbalance,
* [[Electrical conduction|electric currents carried by the motion of charges in a conductor]], driven by a voltage imbalance,
* spontaneous [[chemical reaction]]s driven by a decrease in free energy,
* [[friction]], [[dissipation]], [[quantum decoherence]],
* systems being pumped by external forces ([[optical pumping]], etc.),
* and irreversible processes in general.
All of these processes occur over time with characteristic rates. These rates are important in engineering.
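As a simple illustration of such a characteristic rate, the sketch below (Python with NumPy; the harmonic trap and all parameter values are arbitrary choices made here for illustration) integrates an overdamped [[Langevin equation]], one example of the kind of stochastic modelling described in the following subsection: an ensemble of particles started away from equilibrium relaxes with rate <math>k/\gamma</math>, and its spread settles to the canonical value <math>k_B T / k</math>.

<syntaxhighlight lang="python">
import numpy as np

# Overdamped Langevin dynamics of a particle in a harmonic trap,
#   dx = -(k / gamma) x dt + sqrt(2 k_B T / gamma) dW,
# a standard stochastic model of relaxation towards equilibrium
# (Euler-Maruyama integration; k_B = 1; all values are illustrative).
rng = np.random.default_rng(3)
k, gamma, T = 1.0, 1.0, 0.5
dt, n_steps, n_particles = 1e-3, 5000, 20_000

x = np.full(n_particles, 3.0)                 # start the whole ensemble far from equilibrium
noise = np.sqrt(2.0 * T / gamma * dt)
for _ in range(n_steps):
    x += -(k / gamma) * x * dt + noise * rng.normal(size=n_particles)

# After several relaxation times gamma/k the ensemble mean has decayed towards 0
# (decay rate k/gamma), and the variance approaches the canonical value k_B T / k.
print(x.mean(), x.var(), T / k)
</syntaxhighlight>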
The field of non-equilibrium statistical mechanics is concerned with understanding these non-equilibrium processes at the microscopic level. (Statistical thermodynamics can only be used to calculate the final result, after the external imbalances have been removed and the ensemble has settled back down to equilibrium.)

In principle, non-equilibrium statistical mechanics could be mathematically exact: ensembles for an isolated system evolve over time according to deterministic equations such as [[Liouville's theorem (Hamiltonian)|Liouville's equation]] or its quantum equivalent, the [[von Neumann equation]]. These equations are the result of applying the mechanical equations of motion independently to each state in the ensemble. These ensemble evolution equations inherit much of the complexity of the underlying mechanical motion, and so exact solutions are very difficult to obtain. Moreover, the ensemble evolution equations are fully reversible and do not destroy information (the ensemble's [[Gibbs entropy]] is preserved). In order to make headway in modelling irreversible processes, it is necessary to consider additional factors besides probability and reversible mechanics.

Non-equilibrium mechanics is therefore an active area of theoretical research as the range of validity of these additional assumptions continues to be explored. A few approaches are described in the following subsections.

=== Stochastic methods ===
One approach to non-equilibrium statistical mechanics is to incorporate [[stochastic]] (random) behaviour into the system. Stochastic behaviour destroys information contained in the ensemble. While this is technically inaccurate (aside from [[Black hole information paradox|hypothetical situations involving black holes]], a system cannot in itself cause loss of information), the randomness is added to reflect that information of interest becomes converted over time into subtle correlations within the system, or to correlations between the system and environment. These correlations appear as [[Chaos theory|chaotic]] or [[pseudorandom]] influences on the variables of interest. By replacing these correlations with randomness proper, the calculations can be made much easier.

{{unordered list
|1 = ''[[Boltzmann transport equation]]'': An early form of stochastic mechanics appeared even before the term "statistical mechanics" had been coined, in studies of [[kinetic theory of gases|kinetic theory]]. [[James Clerk Maxwell]] had demonstrated that molecular collisions would lead to apparently chaotic motion inside a gas. [[Ludwig Boltzmann]] subsequently showed that, by taking this [[molecular chaos]] for granted as a complete randomization, the motions of particles in a gas would follow a simple [[Boltzmann transport equation]] that would rapidly restore a gas to an equilibrium state (see [[H-theorem]]). The Boltzmann transport equation and related approaches are important tools in non-equilibrium statistical mechanics due to their extreme simplicity. These approximations work well in systems where the "interesting" information is immediately (after just one collision) scrambled up into subtle correlations, which essentially restricts them to rarefied gases. The Boltzmann transport equation has been found to be very useful in simulations of electron transport in lightly doped [[semiconductor]]s (in [[transistor]]s), where the electrons are indeed analogous to a rarefied gas. A quantum technique related in theme is the [[random phase approximation]].
|2 = ''[[BBGKY hierarchy]]'': In liquids and dense gases, it is not valid to immediately discard the correlations between particles after one collision. The [[BBGKY hierarchy]] (Bogoliubov–Born–Green–Kirkwood–Yvon hierarchy) gives a method for deriving Boltzmann-type equations but also extending them beyond the dilute gas case, to include correlations after a few collisions.
|3 = ''[[Keldysh formalism]]'' (a.k.a. NEGF – non-equilibrium Green functions): A quantum approach to including stochastic dynamics is found in the Keldysh formalism. This approach is often used in electronic [[quantum transport]] calculations.
|4 = Stochastic [[Liouville's theorem (Hamiltonian)|Liouville equation]].
}}

=== Near-equilibrium methods ===
Another important class of non-equilibrium statistical mechanical models deals with systems that are only very slightly perturbed from equilibrium. With very small perturbations, the response can be analysed in [[linear response theory]]. A remarkable result, as formalized by the [[fluctuation–dissipation theorem]], is that the response of a system when near equilibrium is precisely related to the [[Statistical fluctuations|fluctuations]] that occur when the system is in total equilibrium. Essentially, a system that is slightly away from equilibrium – whether put there by external forces or by fluctuations – relaxes towards equilibrium in the same way, since the system cannot tell the difference or "know" how it came to be away from equilibrium.<ref name="balescu"/>{{rp|664}}

This provides an indirect avenue for obtaining numbers such as [[Ohm's law|ohmic conductivity]] and [[thermal conductivity]] by extracting results from equilibrium statistical mechanics. Since equilibrium statistical mechanics is mathematically well defined and (in some cases) more amenable to calculation, the fluctuation–dissipation connection can be a convenient shortcut for calculations in near-equilibrium statistical mechanics.

A few of the theoretical tools used to make this connection include:
* [[Fluctuation–dissipation theorem]]
* [[Onsager reciprocal relations]]
* [[Green–Kubo relations]]
* [[Ballistic conduction#Landauer-Büttiker formalism|Landauer–Büttiker formalism]]
* [[Mori–Zwanzig formalism]]
* [[GENERIC formalism]]

=== Hybrid methods ===
An advanced approach uses a combination of stochastic methods and [[linear response theory]]. As an example, one approach to compute quantum coherence effects ([[weak localization]], [[conductance fluctuations]]) in the conductance of an electronic system is the use of the Green–Kubo relations, with the inclusion of stochastic [[dephasing]] by interactions between various electrons by use of the Keldysh method.<ref>{{cite journal |last1=Altshuler |first1=B L |last2=Aronov |first2=A G |last3=Khmelnitsky |first3=D E |title=Effects of electron-electron collisions with small energy transfers on quantum localisation |journal=Journal of Physics C: Solid State Physics |date=30 December 1982 |volume=15 |issue=36 |pages=7367–7386 |doi=10.1088/0022-3719/15/36/018 |bibcode=1982JPhC...15.7367A }}</ref><ref>{{cite journal |last1=Aleiner |first1=I. L. |last2=Blanter |first2=Ya. M. |title=Inelastic scattering time for conductance fluctuations |journal=Physical Review B |date=28 February 2002 |volume=65 |issue=11 |pages=115317 |doi=10.1103/PhysRevB.65.115317 |url=http://resolver.tudelft.nl/uuid:e7736134-6c36-47f4-803f-0fdee5074b5a |arxiv=cond-mat/0105436 |bibcode=2002PhRvB..65k5317A }}</ref>

==Applications==
The ensemble formalism can be used to analyze general mechanical systems with uncertainty in knowledge about the state of a system. Ensembles are also used in:
* [[propagation of uncertainty]] over time,<ref name="gibbs"/>
* [[regression analysis]] of gravitational [[orbit]]s,
* [[ensemble forecasting]] of weather,
* dynamics of [[neural networks]],
* bounded-rational [[potential game]]s in [[game theory]] and [[non-equilibrium economics]].

Statistical physics explains and quantitatively describes [[superconductivity]], [[superfluidity]], [[turbulence]], collective phenomena in [[solid]]s and [[plasma (physics)|plasma]], and the structural features of [[liquid]]s. It underlies modern [[astrophysics]] and the [[virial theorem]]. In solid state physics, statistical physics aids the study of [[liquid crystals]], [[phase transition]]s, and [[critical phenomena]]. Many experimental studies of matter are entirely based on the statistical description of a system. These include the scattering of cold [[neutron]]s, [[X-ray]]s, [[Visible radiation|visible light]], and more. Statistical physics also plays a role in materials science, nuclear physics, astrophysics, chemistry, biology and medicine (e.g., the study of the spread of infectious diseases).{{fact|date=April 2024}}

Analytical and computational techniques derived from the statistical physics of disordered systems can be extended to large-scale problems, including machine learning, e.g., to analyze the weight space of deep [[neural networks]].<ref>{{cite journal |last1=Ramezanpour |first1=Abolfazl |last2=Beam |first2=Andrew L. |last3=Chen |first3=Jonathan H. |last4=Mashaghi |first4=Alireza |title=Statistical Physics for Medical Diagnostics: Learning, Inference, and Optimization Algorithms |journal=Diagnostics |date=19 November 2020 |volume=10 |issue=11 |pages=972 |doi=10.3390/diagnostics10110972 |doi-access=free |pmid=33228143 |pmc=7699346 }}</ref> Statistical physics is thus finding applications in the area of [[medical diagnostics]].<ref>{{cite journal |last1=Mashaghi |first1=Alireza |last2=Ramezanpour |first2=Abolfazl |title=Statistical physics of medical diagnostics: Study of a probabilistic model |journal=Physical Review E |date=16 March 2018 |volume=97 |issue=3 |page=032118 |doi=10.1103/PhysRevE.97.032118 |pmid=29776109 |arxiv=1803.10019 |bibcode=2018PhRvE..97c2118M }}</ref>

===Quantum statistical mechanics===
{{main|Quantum statistical mechanics}}

[[Quantum statistical mechanics]] is [[statistical mechanics]] applied to [[quantum mechanics|quantum mechanical systems]]. In quantum mechanics, a [[statistical ensemble (mathematical physics)|statistical ensemble]] (probability distribution over possible [[quantum state]]s) is described by a [[density matrix|density operator]] ''S'', which is a non-negative, [[self-adjoint]], [[trace-class]] operator of trace 1 on the [[Hilbert space]] ''H'' describing the quantum system. This can be shown under various [[mathematical formulation of quantum mechanics|mathematical formalisms for quantum mechanics]]. One such formalism is provided by [[quantum logic]].{{fact|date=April 2024}}

== Index of statistical mechanics topics ==

===Physics===
* [[Probability amplitude]]
* [[Statistical physics]]
* [[Boltzmann factor]]
* [[Feynman–Kac formula]]
* [[Fluctuation theorem]]
* [[Information entropy]]
* [[Vacuum expectation value]]
* [[Cosmic variance]]
* [[Negative probability]]
* [[Gibbs state]]
* [[Master equation]]
* [[Partition function (mathematics)]]
* [[Quantum probability]]

===Percolation theory===
* [[Percolation theory]]
* [[Schramm–Loewner evolution]]

== See also ==
* [[List of textbooks in thermodynamics and statistical mechanics]]
* {{section link|Laplace transform#Statistical mechanics}}

== References ==
{{Reflist |refs =
<ref name="gibbs">{{cite book |last=Gibbs |first=Josiah Willard |author-link=Josiah Willard Gibbs |title=Elementary Principles in Statistical Mechanics |year=1902 |publisher=[[Charles Scribner's Sons]] |location=New York |title-link=Elementary Principles in Statistical Mechanics }}</ref>
<ref name="tolman">{{cite book |last1=Tolman |first1=Richard Chace |title=The Principles of Statistical Mechanics |date=1979 |publisher=Courier Corporation |isbn=978-0-486-63896-6 }}{{pn|date=April 2024}}</ref>
<ref name="balescu">{{cite book |last1=Balescu |first1=Radu |title=Equilibrium and Non-Equilibrium Statistical Mechanics |date=1975 |publisher=Wiley |isbn=978-0-471-04600-4 }}{{pn|date=April 2024}}</ref>
<ref name="uffink">{{cite report |type=Preprint |last1=Uffink |first1=Jos |title=Compendium of the foundations of classical statistical physics |date=March 2006 |url=https://philsci-archive.pitt.edu/2691/ }}</ref>
}}

==Further reading==
*{{cite book |first=F. |last=Reif |title=Fundamentals of Statistical and Thermal Physics |date=2009 |publisher=Waveland Press |isbn=978-1-4786-1005-2}}
*{{cite book |doi=10.1142/8709 |title=Basics of Statistical Physics |date=2013 |last1=Müller-Kirsten |first1=Harald J W. |isbn=978-981-4449-53-3 |url=http://www.gbv.de/dms/tib-ub-hannover/741344904.pdf }}
*{{cite web |title=Statistical Physics and other resources |url=https://jfi.uchicago.edu/~leop/ |first=Leo P. |last=Kadanoff |access-date=June 18, 2023 |archive-date=August 12, 2021 |archive-url=https://web.archive.org/web/20210812221245/https://jfi.uchicago.edu/~leop/ |url-status=dead }}
*{{cite book |first=Leo P. |last=Kadanoff |title=Statistical Physics: Statics, Dynamics and Renormalization |year=2000 |publisher=World Scientific |isbn=978-981-02-3764-6}}
*{{cite arXiv |title=History and outlook of statistical physics |first=Dieter |last=Flamm |year=1998 |eprint=physics/9803005}}

== External links ==
{{Commons category|Statistical mechanics}}
* [http://plato.stanford.edu/entries/statphys-statmech/ Philosophy of Statistical Mechanics] article by Lawrence Sklar for the [[Stanford Encyclopedia of Philosophy]].
* [http://www.sklogwiki.org/ Sklogwiki - Thermodynamics, statistical mechanics, and the computer simulation of materials.] SklogWiki is particularly orientated towards liquids and soft condensed matter.
* [http://farside.ph.utexas.edu/teaching/sm1/statmech.pdf Thermodynamics and Statistical Mechanics] by Richard Fitzpatrick
* {{cite arXiv |eprint=1107.0568 |last1=Cohen |first1=Doron |date=2011 |title=Lecture Notes in Statistical Mechanics and Mesoscopics |class=quant-ph }}
* {{YouTube |id = H1Zbp6__uNw&list=PLB72416C707D85AB0&index=1 |title = Videos of lecture series in statistical mechanics }} taught by [[Leonard Susskind]].
* Vu-Quoc, L., [http://clesm.mae.ufl.edu/wiki.pub/index.php/Configuration_integral_%28statistical_mechanics%29 Configuration integral (statistical mechanics)], 2008. This wiki site is down; see [https://web.archive.org/web/20120428193950/http://clesm.mae.ufl.edu/wiki.pub/index.php/Configuration_integral_%28statistical_mechanics%29 this article in the web archive on 2012 April 28].

{{-}}
{{Statistical mechanics topics}}
{{Physics-footer}}
{{Authority control}}

[[Category:Statistical mechanics]]
[[Category:Thermodynamics]]