=== Equivalence of definitions ===
Proofs of equivalence between the entropy in statistical mechanics — the [[Entropy (statistical thermodynamics)#Gibbs entropy formula|Gibbs entropy formula]]:
<math display="block">S = - k_\mathsf{B} \sum_i{p_i \ln{p_i}}</math>
and the entropy in classical thermodynamics:
<math display="block">\mathrm{d} S = \frac{\delta Q_\mathsf{rev}}{T}</math>
together with the [[fundamental thermodynamic relation]] are known for the [[microcanonical ensemble]], the [[canonical ensemble]], the [[grand canonical ensemble]], and the [[isothermal–isobaric ensemble]]. These proofs are based on the probability density of microstates of the generalized [[Boltzmann distribution]] and the identification of the thermodynamic internal energy as the ensemble average <math display="inline">U = \left\langle E_i \right\rangle</math>.<ref>{{cite book |last= Callen|first= Herbert|date= 2001|title= Thermodynamics and an Introduction to Thermostatistics (2nd ed.)|publisher= John Wiley and Sons|isbn= 978-0-471-86256-7}}</ref> Thermodynamic relations are then employed to derive the well-known [[Gibbs entropy formula]]. However, the equivalence between the Gibbs entropy formula and the thermodynamic definition of entropy is not a fundamental thermodynamic relation but rather a consequence of the form of the [[Boltzmann distribution#Generalized Boltzmann distribution|generalized Boltzmann distribution]].<ref>{{cite journal |last1= Gao |first1= Xiang |last2= Gallicchio |first2= Emilio |first3= Adrian |last3= Roitberg |year= 2019 |title= The generalized Boltzmann distribution is the only distribution in which the Gibbs-Shannon entropy equals the thermodynamic entropy |journal= The Journal of Chemical Physics|volume= 151|issue= 3|pages= 034113|doi= 10.1063/1.5111333|pmid= 31325924 |arxiv= 1903.02121 |bibcode= 2019JChPh.151c4113G |s2cid= 118981017 }}</ref> Furthermore, it has been shown that the definition of entropy in statistical mechanics is the only entropy that is equivalent to the classical thermodynamic entropy under the following postulates:<ref name="Gao2022">{{cite journal |last1= Gao |first1= Xiang |date= March 2022 |title= The Mathematics of the Ensemble Theory |journal= Results in Physics|volume= 34|pages= 105230|doi= 10.1016/j.rinp.2022.105230 |bibcode= 2022ResPh..3405230G |s2cid= 221978379 |doi-access= free |arxiv= 2006.00485 }}</ref>

{{ordered list
| The probability density function is proportional to some function of the ensemble parameters and random variables.
| Thermodynamic state functions are described by ensemble averages of random variables.
| At infinite temperature, all the microstates have the same probability.
}}
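As an illustrative sketch of one such proof (restricted here to the canonical ensemble, rather than the full generality of the cited works), take the Boltzmann distribution <math display="inline">p_i = e^{-E_i / k_\mathsf{B} T} / Z</math> with partition function <math display="inline">Z = \sum_i e^{-E_i / k_\mathsf{B} T}</math>. Substituting <math display="inline">\ln p_i = -E_i / k_\mathsf{B} T - \ln Z</math> into the Gibbs entropy formula gives
<math display="block">S = - k_\mathsf{B} \sum_i p_i \left( -\frac{E_i}{k_\mathsf{B} T} - \ln Z \right) = \frac{U}{T} + k_\mathsf{B} \ln Z,</math>
using <math display="inline">U = \left\langle E_i \right\rangle</math> and <math display="inline">\sum_i p_i = 1</math>. Since the [[Helmholtz free energy]] is <math display="inline">F = -k_\mathsf{B} T \ln Z</math>, this reduces to <math display="inline">S = (U - F)/T</math>, the classical thermodynamic relation <math display="inline">F = U - TS</math>.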