{{Short description|Property of a thermodynamic system}}
{{Hatnote group|
{{Other uses}}
{{For introduction}}
{{Distinguish|Enthalpy}}
}}
{{Use Oxford spelling|date=November 2024}}
{{Use dmy dates|date=November 2024}}
{{Infobox physical quantity
| name = Entropy
| unit = joules per kelvin (J⋅K<sup>−1</sup>)
| symbols = ''S''
| baseunits = kg⋅m<sup>2</sup>⋅s<sup>−2</sup>⋅K<sup>−1</sup>
}}
{{Thermodynamics sidebar|expanded=sysprop}}
{{EntropySegments}}
{{Modern physics}}
{{Complex systems}}

'''Entropy''' is a [[Science|scientific]] concept, most commonly associated with states of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from [[classical thermodynamics]], where it was first recognized, to the microscopic description of nature in [[statistical physics]], and to the principles of [[information theory]]. It has found far-ranging applications in [[chemistry]] and [[physics]], in biological systems and their relation to life, in [[cosmology]], [[economics]], [[sociology]], [[Atmospheric science|weather science]], [[climate change]] and [[information system]]s including the transmission of information in [[Telecommunications|telecommunication]].<ref>{{Cite journal|last=Wehrl|first=Alfred|date=1 April 1978|title=General properties of entropy|url=https://link.aps.org/doi/10.1103/RevModPhys.50.221|journal=Reviews of Modern Physics|volume=50|issue=2|pages=221–260|doi=10.1103/RevModPhys.50.221|bibcode=1978RvMP...50..221W}}</ref>

Entropy is central to the [[second law of thermodynamics]], which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time. As a result, isolated systems evolve toward [[thermodynamic equilibrium]], where the entropy is highest. A consequence of the second law of thermodynamics is that certain processes are [[Irreversible process|irreversible]].

The thermodynamic concept was referred to by Scottish scientist and engineer [[William Rankine]] in 1850 with the names ''thermodynamic function'' and ''heat-potential''.<ref>{{cite book |last=Truesdell |first=C. |author-link=Clifford Truesdell |date=1980 |title=The Tragicomical History of Thermodynamics, 1822–1854 |url=https://archive.org/details/tragicomicalhist0000unse |url-access=registration |location=New York |publisher=Springer-Verlag |isbn=0387904034 |page=215 |via=[[Internet Archive]]}}</ref> In 1865, German physicist [[Rudolf Clausius]], one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of [[heat]] to the instantaneous [[temperature]]. He initially described it as ''transformation-content'', in German ''Verwandlungsinhalt'', and later coined the term ''entropy'' from a Greek word for ''transformation''.<ref name=brush1976>[[Stephen G. Brush|Brush, S.G.]] (1976). ''The Kind of Motion We Call Heat: a History of the Kinetic Theory of Gases in the 19th Century, Book 2, Statistical Physics and Irreversible Processes'', Elsevier, Amsterdam, {{ISBN|0-444-87009-1}}, pp. 576–577.</ref> Austrian physicist [[Ludwig Boltzmann]] explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system.
He thereby introduced the concept of statistical disorder and [[probability distribution]]s into a new field of thermodynamics, called [[statistical mechanics]], and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behaviour, in the form of a simple [[logarithm]]ic law, with a [[proportionality (mathematics)|proportionality constant]], the [[Boltzmann constant]], which has become one of the defining universal constants for the modern [[International System of Units]].<ref>{{cite journal |last=Jagannathan |first=Kannan |year=2019 |title=Anxiety and the Equation: Understanding Boltzmann's Entropy |journal=American Journal of Physics |volume=87 |issue=9 |pages=765 |doi=10.1119/1.5116583|bibcode=2019AmJPh..87..765J }}</ref>
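In modern notation, the Clausius definition and the Boltzmann law described above are conventionally written as

<math display="block">\mathrm{d}S = \frac{\delta Q_{\text{rev}}}{T} \qquad \text{and} \qquad S = k_{\text{B}} \ln W,</math>

where <math>\delta Q_{\text{rev}}</math> is an infinitesimal amount of heat transferred reversibly, <math>T</math> is the absolute temperature, <math>k_{\text{B}}</math> is the Boltzmann constant, and <math>W</math> is the number of microstates consistent with the macroscopic state of the system.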