== Connection to information theory ==

''H'' is a forerunner of Shannon's [[information entropy]]. [[Claude Shannon]] denoted his measure of [[Entropy (information theory)|information entropy]] ''H'' after the H-theorem.<ref>Gleick 2011</ref> The article on Shannon's [[information entropy]] contains an [[Shannon entropy#Information entropy explained|explanation]] of the discrete counterpart of the quantity ''H'', known as the information entropy or information uncertainty (with a minus sign). By [[Shannon entropy#Extending discrete entropy to the continuous case: differential entropy|extending the discrete information entropy to the continuous information entropy]], also called [[differential entropy]], one obtains the expression in the equation from the section above, [[#Definition_and_meaning_of_Boltzmann's_H|Definition and meaning of Boltzmann's ''H'']], and thus a better feel for the meaning of ''H''. The ''H''-theorem's connection between information and entropy plays a central role in a recent controversy called the [[black hole information paradox]].
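For orientation, and in notation that may differ slightly from the definition section linked above, Shannon's entropy of a discrete distribution <math>p_1, \dots, p_n</math> is
<math display="block">H_\text{Shannon} = -\sum_{i=1}^{n} p_i \ln p_i ,</math>
and its continuous analogue, the differential entropy of a probability density <math>f</math>, is
<math display="block">h[f] = -\int f(x)\, \ln f(x)\, \mathrm{d}x .</math>
Up to the overall sign (and any constant factors fixed by the choice of phase-space measure), this integral has the same form as Boltzmann's
<math display="block">H = \int f(v)\, \ln f(v)\, \mathrm{d}^3 v ,</math>
which is why taking the continuous limit of Shannon's quantity recovers, apart from the sign, the expression given in the definition section above.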