==Applications to other fields==

===Intelligence uses and secrecy applications===
{{Unreferenced section|date=April 2024}}
Information-theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the [[Ban (unit)|ban]], was used in the [[Ultra (cryptography)|Ultra]] project, breaking the German [[Enigma machine]] code and hastening the [[Victory in Europe Day|end of World War II in Europe]]. Shannon himself defined an important concept now called the [[unicity distance]]. Based on the redundancy of the [[plaintext]], it attempts to give a minimum amount of [[ciphertext]] necessary to ensure unique decipherability.
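For illustration, the unicity distance can be written as

<math display="block">U \approx \frac{H(K)}{D},</math>

where <math>H(K)</math> is the entropy of the key in bits and <math>D</math> is the redundancy of the plaintext in bits per character. Taking the commonly quoted figure of about 3.2 bits of redundancy per letter of English, a cipher with a 128-bit key would be expected to be uniquely decipherable after roughly <math>128/3.2 = 40</math> letters of ciphertext; with less ciphertext than this, several different keys may yield plausible plaintexts.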
Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A [[brute force attack]] can break systems based on [[public-key cryptography|asymmetric key algorithms]] or on the most commonly used [[symmetric-key algorithm|symmetric key algorithms]] (sometimes called secret key algorithms), such as [[block cipher]]s. The security of all such methods comes from the assumption that no known attack can break them in a practical amount of time.

[[Information theoretic security]] refers to methods such as the [[one-time pad]] that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the [[key (cryptography)|key]]) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve their guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the [[Venona project]] was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
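The perfect secrecy of the one-time pad can be verified exactly on a toy alphabet. The following minimal sketch (an illustration only, with an arbitrarily chosen plaintext distribution) enumerates every message–key pair of a two-bit one-time pad and confirms that the mutual information between plaintext and ciphertext is zero:

<syntaxhighlight lang="python">
import math
from collections import Counter
from itertools import product

# Toy one-time pad over 2-bit symbols: c = m XOR k, with a uniform random key.
# Enumerating all (message, key) pairs gives the exact joint distribution of (M, C).
N = 4  # alphabet size
p_m = {0: 0.7, 1: 0.1, 2: 0.1, 3: 0.1}  # arbitrary non-uniform plaintext distribution

joint = Counter()  # joint[(m, c)] = P(M = m, C = c)
for m, k in product(range(N), repeat=2):
    c = m ^ k
    joint[(m, c)] += p_m[m] * (1 / N)  # the key is uniform and independent of M

def mutual_information(joint):
    """I(X; Y) in bits, computed from a joint distribution {(x, y): p}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

print(mutual_information(joint))  # 0.0 (up to rounding): C alone reveals nothing about M
</syntaxhighlight>

Given the key, however, the ciphertext determines the plaintext exactly, so the conditional mutual information <math>I(M;C\mid K)</math> equals <math>H(M)</math> even though <math>I(M;C)=0</math>.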
===Pseudorandom number generation===
{{Unreferenced section|date=April 2024}}
[[Pseudorandom number generator]]s are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed [[cryptographically secure pseudorandom number generator]]s, but even they require [[random seed]]s external to the software to work as intended. These can be obtained via [[Extractor (mathematics)|extractors]], if done carefully. The measure of sufficient randomness in extractors is [[min-entropy]], a value related to Shannon entropy through [[Rényi entropy]]; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor, and so for cryptographic uses.
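The gap between Shannon entropy and min-entropy can be made arbitrarily large, which is why the distinction matters for extractors. As a minimal numeric sketch (with illustrative figures only): a source that emits one "heavy" value half the time and otherwise picks uniformly among <math>2^{20}</math> alternatives has about 11 bits of Shannon entropy, yet an attacker who simply guesses the heavy value succeeds half the time, so its min-entropy is only 1 bit.

<syntaxhighlight lang="python">
import math

# One heavy outcome with probability 1/2; 2**20 light outcomes share the rest.
n_light = 2 ** 20
probs = [0.5] + [0.5 / n_light] * n_light

shannon = -sum(p * math.log2(p) for p in probs)  # = 11.0 bits
min_entropy = -math.log2(max(probs))             # = 1.0 bit

print(f"Shannon entropy: {shannon:.2f} bits")
print(f"Min-entropy:     {min_entropy:.2f} bits")
</syntaxhighlight>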
===Seismic exploration===
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and [[digital signal processing]] offer a major improvement of resolution and image clarity over previous analog methods.<ref>{{cite journal|doi=10.1002/smj.4250020202 | volume=2 | issue=2 | title=The corporation and innovation | year=1981 | journal=Strategic Management Journal | pages=97–118 | last1 = Haggerty | first1 = Patrick E.}}</ref>

===Semiotics===
[[Semiotics|Semioticians]] {{ill|Doede Nauta|nl}} and [[Winfried Nöth]] both considered [[Charles Sanders Peirce]] as having created a theory of information in his works on semiotics.<ref name="Nauta 1972">{{cite book |last1=Nauta |first1=Doede |title=The Meaning of Information |date=1972 |publisher=Mouton |location=The Hague |isbn=9789027919960}}</ref>{{rp|171}}<ref name="Nöth 2012">{{cite journal |last1=Nöth |first1=Winfried |title=Charles S. Peirce's theory of information: a theory of the growth of symbols and of knowledge |journal=Cybernetics and Human Knowing |date=January 2012 |volume=19 |issue=1–2 |pages=137–161 |url=https://edisciplinas.usp.br/mod/resource/view.php?id=2311849}}</ref>{{rp|137}} Nauta defined semiotic information theory as the study of "''the internal processes of coding, filtering, and information processing.''"<ref name="Nauta 1972"/>{{rp|91}}

Concepts from information theory such as redundancy and code control have been used by semioticians such as [[Umberto Eco]] and {{ill|Ferruccio Rossi-Landi|it}} to explain ideology as a form of message transmission whereby a dominant social class emits its message by using signs that exhibit a high degree of redundancy, such that only one message is decoded among a selection of competing ones.<ref>Nöth, Winfried (1981). "[https://kobra.uni-kassel.de/bitstream/handle/123456789/2014122246977/semi_2004_002.pdf?sequence=1&isAllowed=y Semiotics of ideology]". ''Semiotica'', Issue 148.</ref>

===Integrated process organization of neural information===
Quantitative information-theoretic methods have been applied in [[cognitive science]] to analyze the integrated process organization of neural information in the context of the [[binding problem]] in [[cognitive neuroscience]].<ref>{{cite book|last=Maurer|first=H.|year=2021|title=Cognitive Science: Integrative Synchronization Mechanisms in Cognitive Neuroarchitectures of the Modern Connectionism|language=en|publisher=CRC Press|location=Boca Raton/FL|chapter=Chapter 10: Systematic Class of Information Based Architecture Types|isbn=978-1-351-04352-6|doi=10.1201/9781351043526}}</ref> In this context, an information-theoretical measure is defined on the basis of a reentrant process organization, i.e. the synchronization of neurophysiological activity between groups of neuronal populations; examples are {{em|functional clusters}} ([[Gerald Edelman]] and [[Giulio Tononi]]'s functional clustering model and dynamic core hypothesis (DCH)<ref>{{cite book|last1=Edelman|first1=G.M.|first2=G.|last2=Tononi|year=2000|title=A Universe of Consciousness: How Matter Becomes Imagination|language=en|publisher=Basic Books|location=New York|isbn=978-0465013777}}</ref>) and {{em|effective information}} (Tononi's [[integrated information theory]] (IIT) of consciousness<ref>{{cite journal|last1=Tononi|first1=G.|first2=O.|last2=Sporns|year=2003|title=Measuring information integration|journal=BMC Neuroscience|language=en|volume=4|pages=1–20|doi=10.1186/1471-2202-4-31|doi-access=free |pmid=14641936 |pmc=331407 }}</ref><ref>{{cite journal|last=Tononi|first=G.|year=2004a|title=An information integration theory of consciousness|journal=BMC Neuroscience|language=en|volume=5|pages=1–22|doi=10.1186/1471-2202-5-42|doi-access=free |pmid=15522121 |pmc=543470 }}</ref><ref>{{cite book|last=Tononi|first=G.|year=2004b|chapter=Consciousness and the brain: theoretical aspects|editor1-first=G.|editor1-last=Adelman|editor2-first=B.|editor2-last=Smith|title=Encyclopedia of Neuroscience|language=en|edition=3rd|publisher=Elsevier|location=Amsterdam, Oxford|chapter-url=https://www.researchgate.net/publication/265238140|archive-url=https://web.archive.org/web/20231202031406/https://www.jsmf.org/meetings/2003/nov/consciousness_encyclopedia_2003.pdf|archive-date=2023-12-02|url-status=live|isbn=0-444-51432-5}}</ref>). An alternative approach measures the minimization of free energy on the basis of statistical methods ([[Karl J. Friston]]'s [[free energy principle]] (FEP), an information-theoretical measure which states that every adaptive change in a self-organized system leads to a minimization of free energy, and the [[Bayesian brain]] hypothesis<ref>{{cite journal|last1=Friston|first1=K.|first2=K.E.|last2=Stephan|year=2007|title=Free-energy and the brain|journal=Synthese|language=en|volume=159|issue=3 |pages=417–458|doi=10.1007/s11229-007-9237-y|pmid=19325932 |pmc=2660582 }}</ref><ref>{{cite journal|last=Friston|first=K.|year=2010|title=The free-energy principle: a unified brain theory|journal=Nature Reviews Neuroscience|language=en|volume=11|issue=2 |pages=127–138|doi=10.1038/nrn2787|pmid=20068583 }}</ref><ref>{{cite journal|last1=Friston|first1=K.|first2=M.|last2=Breakstear|first3=G.|last3=Deco|year=2012|title=Perception and self-organized instability|journal=Frontiers in Computational Neuroscience|language=en|volume=6|pages=1–19|doi=10.3389/fncom.2012.00044|doi-access=free |pmid=22783185 |pmc=3390798 }}</ref><ref>{{cite journal|last=Friston|first=K.|year=2013|title=Life as we know it|journal=Journal of the Royal Society Interface|language=en|volume=10|issue=86 |pages=20130475|doi=10.1098/rsif.2013.0475|pmid=23825119 |pmc=3730701 }}</ref><ref>{{cite journal|last1=Kirchhoff|first1=M.|first2=T.|last2=Parr|first3=E.|first3=E.|last3=Palacios|first4=K.|last4=Friston|first5=J.|last5=Kiverstein|year=2018|title=The Markov blankets of life: autonomy, active inference and the free energy principle|journal=Journal of the Royal Society Interface|language=en|volume=15|issue=138 |pages=20170792|doi=10.1098/rsif.2017.0792|pmid=29343629 |pmc=5805980 }}</ref>).
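For example, in Tononi and Sporns' formulation, the effective information from one part <math>A</math> of a system to its complement <math>B</math> is obtained by replacing the outputs of <math>A</math> with maximum-entropy noise and measuring the mutual information that results:

<math display="block">\operatorname{EI}(A \rightarrow B) = I\big(A^{H^{\max}}; B\big),</math>

and the integration <math>\Phi</math> of the system is the effective information across the bipartition for which it is minimal, i.e. the system's informational "weakest link".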
===Miscellaneous applications===
Information theory also has applications in the [[search for extraterrestrial intelligence]],<ref>{{Cite journal |last1=Doyle |first1=Laurance R. |author-link=Laurance Doyle |last2=McCowan |first2=Brenda |author-link2=Brenda McCowan |last3=Johnston |first3=Simon |last4=Hanser |first4=Sean F. |date=February 2011 |title=Information theory, animal communication, and the search for extraterrestrial intelligence |journal=[[Acta Astronautica]] |language=en |volume=68 |issue=3–4 |pages=406–417 |doi=10.1016/j.actaastro.2009.11.018|bibcode=2011AcAau..68..406D }}</ref> [[black hole information paradox|black holes]],<ref>{{Cite journal |last=Bekenstein |first=Jacob D |date=2004 |title=Black holes and information theory |url=https://www.tandfonline.com/doi/abs/10.1080/00107510310001632523 |journal=Contemporary Physics |volume=45 |issue=1 |pages=31–43 |doi=10.1080/00107510310001632523 |arxiv=quant-ph/0311049 |bibcode=2004ConPh..45...31B |issn=0010-7514}}</ref> [[bioinformatics]],<ref>{{Cite journal |last=Vinga |first=Susana |date=2014-05-01 |title=Information theory applications for biological sequence analysis |url=https://academic.oup.com/bib/article/15/3/376/183705 |journal=Briefings in Bioinformatics |volume=15 |issue=3 |pages=376–389 |doi=10.1093/bib/bbt068 |issn=1467-5463 |pmc=7109941 |pmid=24058049}}</ref> and [[Gambling and information theory|gambling]].<ref>{{Citation |last=Thorp |first=Edward O. |title=The Kelly criterion in blackjack sports betting, and the stock market |date=2008-01-01 |work=Handbook of Asset and Liability Management |pages=385–428 |editor-last=Zenios |editor-first=S. A. |url=https://linkinghub.elsevier.com/retrieve/pii/B9780444532480500150 |access-date=2025-01-20 |place=San Diego |publisher=North-Holland |doi=10.1016/b978-044453248-0.50015-0 |isbn=978-0-444-53248-0 |editor2-last=Ziemba |editor2-first=W. T.}}</ref><ref>{{Cite journal |last=Haigh |first=John |date=2000 |title=The Kelly Criterion and Bet Comparisons in Spread Betting |url=https://rss.onlinelibrary.wiley.com/doi/10.1111/1467-9884.00251 |journal=Journal of the Royal Statistical Society, Series D (The Statistician) |language=en |volume=49 |issue=4 |pages=531–539 |doi=10.1111/1467-9884.00251 |issn=1467-9884}}</ref>
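The connection to gambling runs through the [[Kelly criterion]] treated in the works cited above. For illustration: for a repeated even-money bet won with probability <math>p > 1/2</math>, the fraction of the bankroll that maximizes the expected logarithm of wealth, and the resulting doubling rate, are

<math display="block">f^{*} = 2p - 1, \qquad G = 1 - H_2(p),</math>

where <math>H_2</math> is the binary entropy function. With <math>p = 0.6</math>, for instance, the gambler should stake <math>f^{*} = 0.2</math> of the bankroll each round, for a growth rate of about 0.029 bits per bet.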