{{short description|Scientific study of digital information}} {{distinguish|Information science}} {{Information theory}} '''Information theory''' is the mathematical study of the [[quantification (science)|quantification]], [[Data storage|storage]], and [[telecommunications|communication]] of [[information]]. The field was established and formalized by [[Claude Shannon]] in the 1940s,<ref>{{Cite journal |last=Schneider |first=Thomas D. |date=2006 |title=Claude Shannon: Biologist |journal=IEEE Engineering in Medicine and Biology Magazine: The Quarterly Magazine of the Engineering in Medicine & Biology Society |volume=25 |issue=1 |pages=30–33 |doi=10.1109/memb.2006.1578661 |issn=0739-5175 |pmc=1538977 |pmid=16485389}}</ref> though early contributions were made in the 1920s through the works of [[Harry Nyquist]] and [[Ralph Hartley]]. It is at the intersection of [[electronic engineering]], [[mathematics]], [[statistics]], [[computer science]], [[Neuroscience|neurobiology]], [[physics]], and [[electrical engineering]].<ref name=":2">{{Cite journal |last1=Cruces |first1=Sergio |last2=Martín-Clemente |first2=Rubén |last3=Samek |first3=Wojciech |date=2019-07-03 |title=Information Theory Applications in Signal Processing |journal=Entropy |volume=21 |issue=7 |pages=653 |doi=10.3390/e21070653 |doi-access=free |issn=1099-4300 |pmc=7515149 |pmid=33267367|bibcode=2019Entrp..21..653C }}</ref><ref name=":0">{{Cite book |url=https://books.google.com/books?id=TNpVEAAAQBAJ&pg=PA23 |title=Fractional Order Systems and Applications in Engineering |publisher=Academic Press |year=2023 |isbn=978-0-323-90953-2 |editor-last=Baleanu |editor-first=D. |series=Advanced Studies in Complex Systems |location=London, United Kingdom |pages=23 |language=en |oclc=on1314337815 |editor-last2=Balas |editor-first2=Valentina Emilia |editor-last3=Agarwal |editor-first3=Praveen}}</ref> A key measure in information theory is [[information entropy|entropy]]. Entropy quantifies the amount of uncertainty involved in the value of a [[random variable]] or the outcome of a [[random process]]. For example, identifying the outcome of a [[Fair coin|fair]] [[coin flip]] (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome from a roll of a [[dice|die]] (which has six equally likely outcomes). Some other important measures in information theory are [[mutual information]], [[channel capacity]], [[error exponent]]s, and [[relative entropy]]. Important sub-fields of information theory include [[source coding]], [[algorithmic complexity theory]], [[algorithmic information theory]] and [[information-theoretic security]]. Applications of fundamental topics of information theory include source coding/[[data compression]] (e.g. for [[ZIP (file format)|ZIP files]]), and channel coding/[[error detection and correction]] (e.g. for [[digital subscriber line|DSL]]). 
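
As a brief illustration of the coin–die comparison above (a standard calculation, using only the uniform-distribution case), a source with <math>n</math> equally likely outcomes has entropy
<math display="block">H = -\sum_{i=1}^{n} \frac{1}{n}\log_2\frac{1}{n} = \log_2 n \ \text{bits},</math>
so a fair coin flip provides <math>\log_2 2 = 1</math> bit of information, whereas a fair die roll provides <math>\log_2 6 \approx 2.585</math> bits.
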
Its impact has been crucial to the success of the [[Voyager program|Voyager]] missions to deep space,<ref name=":22">{{Cite web |last=Horgan |first=John |author-link=John Horgan (journalist) |date=2016-04-27 |title=Claude Shannon: Tinkerer, Prankster, and Father of Information Theory |url=https://spectrum.ieee.org/claude-shannon-tinkerer-prankster-and-father-of-information-theory |access-date=2024-11-08 |website=[[IEEE]] |language=en}}</ref> the invention of the [[compact disc]], the feasibility of mobile phones and the development of the [[Internet]] and [[artificial intelligence]].<ref>{{Cite book |last=Shi |first=Zhongzhi |url=https://books.google.com/books?id=xMTFCgAAQBAJ |title=Advanced Artificial Intelligence |date=2011 |publisher=[[World Scientific Publishing]] |isbn=978-981-4291-34-7 |pages=2 |language=en |doi=10.1142/7547}}</ref><ref>{{Cite book |last1=Sinha |first1=Sudhi |url=https://books.google.com/books?id=2pb-DwAAQBAJ |title=Reimagining Businesses with AI |last2=Al Huraimel |first2=Khaled |date=2020-10-20 |publisher=Wiley |isbn=978-1-119-70915-2 |edition=1 |pages=4 |language=en |doi=10.1002/9781119709183}}</ref><ref name=":0" /> The theory has also found applications in other areas, including [[statistical inference]],<ref>{{cite book|last1=Burnham|first1=K. P.|last2=Anderson|first2=D. R.|year=2002|title=Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach|edition=Second|language=en|publisher=Springer Science|location=New York|isbn=978-0-387-95364-9}}</ref> [[cryptography]], [[neurobiology]],<ref name="Spikes">{{cite book|title=Spikes: Exploring the Neural Code|author1=F. Rieke|author2=D. Warland|author3=R Ruyter van Steveninck|author4=W Bialek|publisher=The MIT press|year=1997|isbn=978-0262681087}}</ref> [[perception]],<ref>{{Cite journal|last1=Delgado-Bonal|first1=Alfonso|last2=Martín-Torres|first2=Javier|date=2016-11-03|title=Human vision is determined based on information theory|journal=Scientific Reports|language=En|volume=6|issue=1|pages=36038|bibcode=2016NatSR...636038D|doi=10.1038/srep36038|issn=2045-2322|pmc=5093619|pmid=27808236}}</ref> [[signal processing]],<ref name=":2" /> [[linguistics]], the evolution<ref>{{cite journal|last1=cf|last2=Huelsenbeck|first2=J. P.|last3=Ronquist|first3=F.|last4=Nielsen|first4=R.|last5=Bollback|first5=J. P.|year=2001|title=Bayesian inference of phylogeny and its impact on evolutionary biology|journal=Science|volume=294|issue=5550|pages=2310–2314|bibcode=2001Sci...294.2310H|doi=10.1126/science.1065889|pmid=11743192|s2cid=2138288}}</ref> and function<ref>{{cite journal|last1=Allikmets|first1=Rando|last2=Wasserman|first2=Wyeth W.|last3=Hutchinson|first3=Amy|last4=Smallwood|first4=Philip|last5=Nathans|first5=Jeremy|last6=Rogan|first6=Peter K.|year=1998|title=Organization of the ABCR gene: analysis of promoter and splice junction sequences|url=http://alum.mit.edu/www/toms/|journal=Gene|volume=215|issue=1|pages=111–122|doi=10.1016/s0378-1119(98)00269-8|pmid=9666097|doi-access=free}}</ref> of molecular codes ([[bioinformatics]]), [[thermal physics]],<ref>{{cite journal|last1=Jaynes|first1=E. T.|year=1957|title=Information Theory and Statistical Mechanics|url=http://bayes.wustl.edu/|journal=Phys.
Rev.|volume=106|issue=4|page=620|bibcode=1957PhRv..106..620J|doi=10.1103/physrev.106.620|s2cid=17870175 }}</ref> [[molecular dynamics]],<ref>{{Cite journal|last1=Talaat|first1=Khaled|last2=Cowen|first2=Benjamin|last3=Anderoglu|first3=Osman|date=2020-10-05|title=Method of information entropy for convergence assessment of molecular dynamics simulations|journal=Journal of Applied Physics|language=En|volume=128|issue=13|pages=135102|doi=10.1063/5.0019078|bibcode=2020JAP...128m5102T|osti=1691442|s2cid=225010720|doi-access=free}}</ref> [[black hole]]s, [[quantum computing]], [[information retrieval]], [[Intelligence (Information Gathering)|intelligence gathering]], [[plagiarism detection]],<ref>{{cite journal|last1=Bennett|first1=Charles H.|last2=Li|first2=Ming|last3=Ma|first3=Bin|year=2003|title=Chain Letters and Evolutionary Histories|url=http://sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75|journal=[[Scientific American]]|volume=288|issue=6|pages=76–81|bibcode=2003SciAm.288f..76B|doi=10.1038/scientificamerican0603-76|pmid=12764940|access-date=2008-03-11|archive-url=https://web.archive.org/web/20071007041539/http://www.sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75|archive-date=2007-10-07|url-status=dead}}</ref> [[pattern recognition]], [[anomaly detection]],<ref>{{Cite web|url=http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf|title=Some background on why people in the empirical sciences may want to better understand the information-theoretic methods|author=David R. Anderson|date=November 1, 2003|archive-url=https://web.archive.org/web/20110723045720/http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf|archive-date=July 23, 2011|url-status=dead|access-date=2010-06-23}} </ref> the analysis of [[music]],<ref>{{Citation |last=Loy |first=D. Gareth |title=Music, Expectation, and Information Theory |date=2017 |work=The Musical-Mathematical Mind: Patterns and Transformations |series=Computational Music Science |pages=161–169 |editor-last=Pareyon |editor-first=Gabriel |url=https://link.springer.com/chapter/10.1007/978-3-319-47337-6_17 |access-date=2024-09-19 |place=Cham |publisher=Springer International Publishing |language=en |doi=10.1007/978-3-319-47337-6_17 |isbn=978-3-319-47337-6 |editor2-last=Pina-Romero |editor2-first=Silvia |editor3-last=Agustín-Aquino |editor3-first=Octavio A. 
|editor4-last=Lluis-Puebla |editor4-first=Emilio}}</ref><ref>{{Cite journal |last1=Rocamora |first1=Martín |last2=Cancela |first2=Pablo |last3=Biscainho |first3=Luiz |date=2019-04-05 |title=Information Theory Concepts Applied to the Analysis of Rhythm in Recorded Music with Recurrent Rhythmic Patterns |url=http://www.aes.org/e-lib/browse.cfm?elib=20449 |journal=Journal of the Audio Engineering Society |volume=67 |issue=4 |pages=160–173 |doi=10.17743/jaes.2019.0003}}</ref> [[Art|art creation]],<ref>{{Cite journal |last=Marsden |first=Alan |date=2020 |title=New Prospects for Information Theory in Arts Research |url=https://direct.mit.edu/leon/article/53/3/274-280/96875 |journal=Leonardo |language=en |volume=53 |issue=3 |pages=274–280 |doi=10.1162/leon_a_01860 |issn=0024-094X}}</ref> [[imaging system]] design,<ref>{{Cite arXiv|title=Universal evaluation and design of imaging systems using information estimation|eprint=2405.20559 |last1=Pinkard |first1=Henry |last2=Kabuli |first2=Leyla |last3=Markley |first3=Eric |last4=Chien |first4=Tiffany |last5=Jiao |first5=Jiantao |last6=Waller |first6=Laura |date=2024 |class=physics.optics }}</ref> study of [[outer space]],<ref>{{Cite journal |last1=Wing |first1=Simon |last2=Johnson |first2=Jay R. |date=2019-02-01 |title=Applications of Information Theory in Solar and Space Physics |journal=Entropy |language=en |volume=21 |issue=2 |pages=140 |doi=10.3390/e21020140 |issn=1099-4300 |pmc=7514618 |pmid=33266856 |doi-access=free|bibcode=2019Entrp..21..140W }}</ref> the dimensionality of [[space]],<ref>{{Cite journal |last=Kak |first=Subhash |date=2020-11-26 |title=Information theory and dimensionality of space |journal=Scientific Reports |language=en |volume=10 |issue=1 |pages=20733 |doi=10.1038/s41598-020-77855-9 |pmid=33244156 |pmc=7693271 |issn=2045-2322}}</ref> and [[epistemology]].<ref>{{Cite journal |last=Harms |first=William F. |date=1998 |title=The Use of Information Theory in Epistemology |url=https://www.jstor.org/stable/188281 |journal=Philosophy of Science |volume=65 |issue=3 |pages=472–501 |doi=10.1086/392657 |jstor=188281 |issn=0031-8248}}</ref>