==Overview==

Information theory studies the transmission, processing, extraction, and utilization of [[information]]. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication over a noisy channel, this abstract concept was formalized in 1948 by [[Claude Shannon]] in a paper entitled ''[[A Mathematical Theory of Communication]]'', in which information is thought of as a set of possible messages, and the goal is to send these messages over a noisy channel and to have the receiver reconstruct the message with a low probability of error, in spite of the channel noise. Shannon's main result, the [[noisy-channel coding theorem]], showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity that depends only on the statistics of the channel over which the messages are sent.<ref name="Spikes" />

Coding theory is concerned with finding explicit methods, called ''codes'', for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be roughly subdivided into data compression (source coding) and [[error-correction]] (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible.<ref>{{Cite book |last1=Berrou |first1=C. |last2=Glavieux |first2=A. |last3=Thitimajshima |first3=P. |chapter=Near Shannon limit error-correcting coding and decoding: Turbo-codes. 1 |date=May 1993 |title=Proceedings of ICC '93 - IEEE International Conference on Communications |chapter-url=https://ieeexplore.ieee.org/document/397441 |volume=2 |pages=1064–1070 vol.2 |doi=10.1109/ICC.1993.397441 |isbn=0-7803-0950-2}}</ref><ref>{{Cite journal |last=MacKay |first=D.J.C. |date=March 1999 |title=Good error-correcting codes based on very sparse matrices |url=https://ieeexplore.ieee.org/document/748992 |journal=IEEE Transactions on Information Theory |volume=45 |issue=2 |pages=399–431 |doi=10.1109/18.748992 |issn=1557-9654}}</ref>

A third class of information theory codes is [[cryptographic algorithm]]s (both [[code (cryptography)|code]]s and [[cipher]]s). Concepts, methods and results from coding theory and information theory are widely used in cryptography and [[cryptanalysis]],<ref>{{Cite book |last1=Menezes |first1=Alfred J. |url=https://www.taylorfrancis.com/books/9780429881329 |title=Handbook of Applied Cryptography |last2=van Oorschot |first2=Paul C. |last3=Vanstone |first3=Scott A. |date=2018-12-07 |publisher=CRC Press |isbn=978-0-429-46633-5 |edition=1 |language=en |doi=10.1201/9780429466335}}</ref> such as the [[Ban (unit)|unit ban]].
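To make the dependence of capacity on channel statistics concrete, consider a worked example (a standard textbook illustration, not drawn from the sources cited above): the [[binary symmetric channel]], which flips each transmitted bit independently with crossover probability <math>p</math>. Its capacity, in bits per channel use, is

<math display="block">C = 1 - H_\text{b}(p) = 1 + p \log_2 p + (1 - p) \log_2 (1 - p),</math>

where <math>H_\text{b}</math> denotes the [[binary entropy function]]. For <math>p = 0.11</math>, this gives <math>C \approx 0.5</math>, so by the noisy-channel coding theorem rates up to roughly half a bit of information per transmitted bit are asymptotically achievable over this channel, and no higher.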