=== Fundamental postulate ===

A [[sufficient condition|sufficient]] (but not necessary) condition for statistical equilibrium with an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, etc.).<ref name="gibbs" /> There are many different equilibrium ensembles that can be considered, and only some of them correspond to thermodynamics.<ref name="gibbs" /> Additional postulates are necessary to motivate why the ensemble for a given system should have one form or another.

A common approach found in many textbooks is to take the ''equal a priori probability postulate''.<ref name="tolman"/> This postulate states that

: ''For an isolated system with an exactly known energy and exactly known composition, the system can be found with ''equal probability'' in any [[microstate (statistical mechanics)|microstate]] consistent with that knowledge.''

The equal a priori probability postulate therefore provides a motivation for the [[microcanonical ensemble]] described below. There are various arguments in favour of the equal a priori probability postulate:

* [[Ergodic hypothesis]]: An ergodic system is one that evolves over time to explore "all accessible" states: all those with the same energy and composition. In an ergodic system, the microcanonical ensemble is the only possible equilibrium ensemble with fixed energy. This approach has limited applicability, since most systems are not ergodic.
* [[Principle of indifference]]: In the absence of any further information, we can only assign equal probabilities to each compatible situation.
* [[Maximum entropy thermodynamics|Maximum information entropy]]: A more elaborate version of the principle of indifference states that the correct ensemble is the ensemble that is compatible with the known information and that has the largest [[Gibbs entropy]] ([[information entropy]]).<ref>{{cite journal | last = Jaynes | first = E. | author-link = Edwin Thompson Jaynes | title = Information Theory and Statistical Mechanics | doi = 10.1103/PhysRev.106.620 | journal = Physical Review | volume = 106 | issue = 4 | pages = 620–630 | year = 1957 | bibcode = 1957PhRv..106..620J }}</ref>

Other fundamental postulates for statistical mechanics have also been proposed.<ref name="uffink"/><ref name="Gao2019" /><ref name="Gao2022" /> For example, recent studies show that the theory of statistical mechanics can be built without the equal a priori probability postulate.<ref name="Gao2019">{{cite journal |last1=Gao |first1=Xiang |last2=Gallicchio |first2=Emilio |last3=Roitberg |first3=Adrian E. |title=The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy |journal=The Journal of Chemical Physics |date=21 July 2019 |volume=151 |issue=3 |page=034113 |doi=10.1063/1.5111333 |pmid=31325924 |arxiv=1903.02121 |bibcode=2019JChPh.151c4113G }}</ref><ref name="Gao2022">{{cite journal |last1=Gao |first1=Xiang |date=March 2022 |title=The Mathematics of the Ensemble Theory |journal=Results in Physics |volume=34 |pages=105230 |doi=10.1016/j.rinp.2022.105230 |bibcode=2022ResPh..3405230G |s2cid=221978379 |doi-access=free |arxiv=2006.00485 }}</ref> One such formalism is based on the [[fundamental thermodynamic relation]] together with the following set of postulates:<ref name="Gao2019" />

{{ordered list
| The probability density function is proportional to some function of the ensemble parameters and random variables.
| Thermodynamic state functions are described by ensemble averages of random variables.
| The entropy as defined by the [[Entropy_(statistical_thermodynamics)#Gibbs entropy formula|Gibbs entropy formula]] matches with the entropy as defined in [[Entropy (classical thermodynamics)|classical thermodynamics]].
}}

where the third postulate can be replaced by the following:<ref name="Gao2022" />

{{ordered list|start=3
| At infinite temperature, all the microstates have the same probability.
}}
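The equal a priori probability postulate can be made concrete with a small numerical sketch. The following Python snippet uses a hypothetical toy model chosen purely for illustration (four two-level particles with an exactly known total energy; it is not drawn from the cited references). It enumerates the microstates consistent with that knowledge, assigns each one the same probability as the microcanonical ensemble prescribes, and checks that the resulting Gibbs entropy equals <math>k_\text{B} \ln \Omega</math>.

<syntaxhighlight lang="python">
# Toy illustration of the equal a priori probability postulate
# (hypothetical model: an isolated system of N two-level particles,
#  each with energy 0 or eps, and exactly known total energy E_total).
from itertools import product
from math import log

N = 4               # number of two-level particles (toy choice)
eps = 1.0           # single-particle excitation energy (arbitrary units)
E_total = 2 * eps   # exactly known total energy of the isolated system
k_B = 1.0           # Boltzmann constant set to 1 for simplicity

# Enumerate all microstates (each particle excited or not) and keep only
# those consistent with the known total energy.
microstates = [s for s in product((0, 1), repeat=N)
               if sum(s) * eps == E_total]
omega = len(microstates)   # number of accessible microstates

# Microcanonical ensemble: equal probability for every accessible microstate.
p = [1.0 / omega] * omega

# Gibbs entropy S = -k_B * sum_i p_i ln p_i reduces to k_B ln(Omega)
# for this uniform distribution.
S_gibbs = -k_B * sum(pi * log(pi) for pi in p)
print(omega, S_gibbs, k_B * log(omega))   # 6  1.7917...  1.7917...
</syntaxhighlight>

Any non-uniform probability assignment over the same six accessible microstates would give a strictly smaller Gibbs entropy, which is the content of the maximum information entropy argument listed above.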