===Information theory=== {{main|Information theory}} [[Information theory]] is a branch of [[applied mathematics]], [[electrical engineering]], and [[computer science]] involving the [[Quantification (science)|quantification]] of [[information]]. Information theory was developed by [[Claude E. Shannon]] to find fundamental limits on [[signal processing]] operations such as [[data compression|compressing data]] and on reliably [[Computer data storage|storing]] and [[Telecommunications|communicating]] data. Since its inception it has broadened to find applications in many other areas, including [[statistical inference]], [[natural language processing]], [[cryptography]], [[neurobiology]],<ref>{{cite book|author1=F. Rieke |author2=D. Warland |author3=R Ruyter van Steveninck |author4=W Bialek |title=Spikes: Exploring the Neural Code|publisher=The MIT Press|year=1997|isbn=978-0262681087}}</ref> the evolution<ref>{{cite journal | last1=Huelsenbeck | first1=J. P. |first2=F. |last2=Ronquist|first3=R. |last3=Nielsen |first4=J. P. |last4=Bollback| title=Bayesian Inference of Phylogeny and Its Impact on Evolutionary Biology | journal=Science | publisher=American Association for the Advancement of Science (AAAS) | volume=294 | issue=5550 | date=2001-12-14 | issn=0036-8075 | doi=10.1126/science.1065889 | pages=2310–2314| pmid=11743192 | bibcode=2001Sci...294.2310H | s2cid=2138288 }}</ref> and function<ref>Rando Allikmets, Wyeth W. Wasserman, Amy Hutchinson, Philip Smallwood, Jeremy Nathans, Peter K. Rogan, [http://alum.mit.edu/www/toms/ Thomas D. Schneider], Michael Dean (1998) Organization of the ABCR gene: analysis of promoter and splice junction sequences, ''Gene'' '''215''':1, 111–122</ref> of molecular codes, [[model selection]] in statistics,<ref>Burnham, K. P. and Anderson D. R. 
(2002) ''Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, Second Edition'' (Springer Science, New York) {{isbn|978-0-387-95364-9}}.</ref> thermal physics,<ref>{{cite journal | last=Jaynes | first=E. T. | title=Information Theory and Statistical Mechanics | journal=[[Physical Review]] | publisher=American Physical Society (APS) | volume=106 | issue=4 | date=1957-05-15 | issn=0031-899X | doi=10.1103/physrev.106.620 | pages=620–630| bibcode=1957PhRv..106..620J | s2cid=17870175 }}</ref> [[quantum computing]], [[linguistics]], plagiarism detection,<ref>Charles H. Bennett, Ming Li, and Bin Ma (2003) [http://sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75 Chain Letters and Evolutionary Histories] {{Webarchive|url=https://web.archive.org/web/20071007041539/http://sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75 |date=2007-10-07 }}, ''Scientific American'' '''288''':6, 76–81</ref> [[pattern recognition]], [[anomaly detection]] and other forms of [[data analysis]].<ref>{{Cite web |author = David R. Anderson |title = Some background on why people in the empirical sciences may want to better understand the information-theoretic methods |date = November 1, 2003 |url = http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf |access-date = 2010-06-23 |url-status = dead |archive-url = https://web.archive.org/web/20110723045720/http://aicanderson2.home.comcast.net/~aicanderson2/home.pdf |archive-date = July 23, 2011 }}</ref> Applications of fundamental topics of information theory include [[lossless data compression]] (e.g. [[ZIP (file format)|ZIP files]]), [[lossy data compression]] (e.g. [[MP3]]s and [[JPEG]]s), and [[channel capacity|channel coding]] (e.g. for [[DSL|Digital Subscriber Line (DSL)]]).
The field is at the intersection of [[mathematics]], [[statistics]], [[computer science]], [[physics]], [[neurobiology]], and [[electrical engineering]]. Its impact has been crucial to the success of the [[Voyager program|Voyager]] missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the [[Internet]], the study of [[linguistics]] and of human perception, the understanding of [[black hole]]s, and numerous other fields. Important sub-fields of information theory are [[source coding]], [[channel coding]], [[algorithmic complexity theory]], [[algorithmic information theory]], [[information-theoretic security]], and measures of information.
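The "quantification of information" mentioned above rests on Shannon entropy, the expected number of bits needed per symbol from a source. As an illustrative sketch (not part of the article's cited material), the entropy of a discrete probability distribution can be computed directly from its definition, H = −Σ p·log₂ p:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    `probs` is a sequence of probabilities summing to 1; zero-probability
    outcomes contribute nothing (lim p->0 of p*log2 p is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss;
# four equally likely outcomes carry 2 bits.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

This quantity is the fundamental limit for lossless compression: no code can, on average, use fewer bits per symbol than the source's entropy.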