=== Information theory ===
{{Main|Entropy (information theory)|Entropy in thermodynamics and information theory|Entropic uncertainty}}
{{quote box |align=right |width=30em |quote=I thought of calling it "information", but the word was overly used, so I decided to call it "uncertainty". [...] Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage." |source=Conversation between [[Claude Shannon]] and [[John von Neumann]] regarding what name to give to the [[attenuation]] in phone-line signals<ref>{{cite journal | last1 = Tribus | first1 = M. | last2 = McIrvine | first2 = E. C. | year = 1971 | title = Energy and information | url=https://www.jstor.org/stable/24923125 | journal = Scientific American | volume = 224 | issue = 3 | pages = 178–184 | jstor = 24923125 }}</ref>}}

When viewed in terms of [[information theory]], the entropy state function is the amount of information in the system that is needed to fully specify the microstate of the system. [[Entropy (information theory)|Entropy]] is the measure of the amount of missing information before reception.<ref>{{cite book|first=Roger |last=Balian |chapter=Entropy, a Protean concept |editor-last=Dalibard |editor-first=Jean |title=Poincaré Seminar 2003: Bose-Einstein condensation – entropy |year=2004 |publisher=Birkhäuser |location=Basel |isbn=978-3-7643-7116-6 |pages=119–144}}</ref> Often called ''Shannon entropy'', it was originally devised by [[Claude Shannon]] in 1948 to study the amount of information in a transmitted message. Information entropy is defined in terms of a discrete set of probabilities <math display="inline">p(x_i)</math> so that:<math display="block">H(X) = - \sum_{i=1}^n{p(x_i) \log{p(x_i)}}</math>where the base of the logarithm determines the units (for example, the [[binary logarithm]] corresponds to [[bit]]s). In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of the average amount of information in a message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of binary questions needed to determine the content of the message.<ref name="Perplexed" />

Most researchers consider information entropy and thermodynamic entropy directly linked to the same concept,<ref>{{Cite book|last=Brillouin|first=Leon|title=Science and Information Theory|year=1956|publisher=Dover Publications|isbn=978-0-486-43918-1}}</ref><ref name="Georgescu-Roegen 1971">{{Cite book|last=Georgescu-Roegen|first=Nicholas|title=The Entropy Law and the Economic Process|publisher=Harvard University Press|year=1971|isbn=978-0-674-25781-8 |url=https://archive.org/details/entropylawe00nich}}</ref><ref>{{Cite book|last=Chen|first=Jing|title=The Physical Foundation of Economics – an Analytical Thermodynamic Theory|publisher=World Scientific|year=2005|isbn=978-981-256-323-1}}</ref><ref>{{cite journal | last1 = Kalinin | first1 = M.I. | last2 = Kononogov | first2 = S.A. | year = 2005 | title = Boltzmann's constant | journal = Measurement Techniques | volume = 48 | issue = 7 | pages = 632–636 | doi=10.1007/s11018-005-0195-9 | bibcode = 2005MeasT..48..632K | s2cid = 118726162 }}</ref><ref>{{cite book|last1=Ben-Naim|first1=Arieh|title=Entropy Demystified: The Second Law Reduced to Plain Common Sense|url=https://archive.org/details/entropydemystifi0000benn|url-access=registration|date=2008|publisher=World Scientific|location=Singapore|isbn=9789812832269|edition=Expanded}}</ref> while others argue that they are distinct.<ref>{{cite book|first1=Joseph J.|last1=Vallino|first2=Christopher K. |last2=Algar|first3=Nuria Fernández|last3=González|first4=Julie A.|last4=Huber|editor-first1=Roderick C.|editor-last1=Dewar|editor-first2=Charles H. |editor-last2=Lineweaver|editor-first3=Robert K.|editor-last3=Niven|editor-first4=Klaus|editor-last4=Regenauer-Lieb|date=2013|chapter=Use of Receding Horizon Optimal Control to Solve MaxEP-Based (max entropy production) Biogeochemistry Problems |department=Living Systems as Catalysts|chapter-url=https://books.google.com/books?id=xF65BQAAQBAJ&pg=PA340|title=Beyond the Second Law: Entropy Production & Non-equilibrium Systems|page=340|isbn=978-3642401534|publisher=Springer |access-date=31 August 2019 |quote=...ink on the page forms a pattern that contains information, the entropy of the page is lower than a page with randomized letters; however, the reduction of entropy is trivial compared to the entropy of the paper the ink is written on. If the paper is burned, it hardly matters in a thermodynamic context if the text contains the meaning of life or only {{sic|jibberish}}.}}</ref> Both expressions are mathematically similar. If <math display="inline">W</math> is the number of microstates that can yield a given macrostate, and each microstate has the same ''[[A priori knowledge|a priori]]'' probability, then that probability is <math display="inline">p = 1/W</math>. The Shannon entropy (in [[Nat (unit)|nats]]) is:<math display="block">H = - \sum_{i=1}^W{p_i \ln{p_i}} = \ln{W}</math>and if entropy is measured in units of <math display="inline">k</math> per nat, then the entropy is given by:<math display="block">H = k \ln{W}</math>which is the [[Boltzmann's entropy formula|Boltzmann entropy formula]], where <math display="inline">k</math> is the Boltzmann constant, which may be interpreted as the thermodynamic entropy per nat. Some authors argue for dropping the word entropy for the <math display="inline">H</math> function of information theory and using Shannon's other term, "uncertainty", instead.<ref>Schneider, Tom, DELILA system (Deoxyribonucleic Acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, Frederick, MD.</ref>
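As a minimal numerical sketch of the formulas above (not drawn from the cited sources; the function name <code>shannon_entropy</code> is illustrative), the following Python code evaluates the Shannon entropy of a discrete distribution and, for <math display="inline">W</math> equally likely outcomes, shows its reduction to <math display="inline">\log_2 W</math> bits and to the Boltzmann form <math display="inline">k \ln W</math>:

<syntaxhighlight lang="python">
import math

def shannon_entropy(probabilities, base=2.0):
    """Shannon entropy of a discrete distribution, in units set by the logarithm base."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Equal probabilities: 8 equally likely messages need log2(8) = 3 binary questions.
W = 8
uniform = [1.0 / W] * W
print(shannon_entropy(uniform, base=2))        # ≈ 3.0 bits
print(shannon_entropy(uniform, base=math.e))   # ln(8) ≈ 2.079 nats

# Treating the W possibilities as microstates, multiplying the entropy in nats
# by the Boltzmann constant gives the Boltzmann form S = k ln W (in J/K).
k_B = 1.380649e-23  # Boltzmann constant, J/K
print(k_B * shannon_entropy(uniform, base=math.e))  # ≈ 2.87e-23 J/K
</syntaxhighlight>

The base of the logarithm only sets the unit: base 2 gives bits, base ''e'' gives nats, and multiplying the value in nats by <math display="inline">k</math> expresses the same quantity in joules per kelvin.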