==Relation to entropy==
For dynamical systems, the entropy rate and the algorithmic complexity of the trajectories are related by a theorem of Brudno, which states that the equality <math>K(x;T) = h(T)</math> holds for almost all <math>x</math>.<ref>{{cite journal |first1=Stefano |last1=Galatolo |first2=Mathieu |last2=Hoyrup |first3=Cristóbal |last3=Rojas |title=Effective symbolic dynamics, random points, statistical behavior, complexity and entropy |journal=Information and Computation |volume=208 |pages=23–41 |year=2010 |url=http://www.loria.fr/~hoyrup/random_ergodic.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://www.loria.fr/~hoyrup/random_ergodic.pdf |archive-date=2022-10-09 |url-status=live |doi=10.1016/j.ic.2009.05.001 |arxiv=0801.0209 |s2cid=5555443}}</ref>

It can be shown<ref>{{cite arXiv |author=Alexei Kaltchenko |title=Algorithms for Estimating Information Distance with Application to Bioinformatics and Linguistics |year=2004 |eprint=cs.CC/0404039}}</ref> that for the output of [[Markov information source]]s, Kolmogorov complexity is related to the [[Entropy (information theory)|entropy]] of the information source. More precisely, the Kolmogorov complexity of the output of a Markov information source, normalized by the length of the output, converges almost surely (as the length of the output goes to infinity) to the entropy of the source.

'''Theorem.''' (Theorem 14.2.5 of <ref name=":1">{{cite book |last1=Cover |first1=Thomas M. |last2=Thomas |first2=Joy A. |title=Elements of information theory |edition=2nd |publisher=Wiley-Interscience |year=2006 |isbn=0-471-24195-4}}</ref>) The conditional Kolmogorov complexity of a binary string <math>x_{1:n}</math> satisfies
<math display="block">\frac{1}{n} K(x_{1:n} \mid n) \leq H_b\!\left(\frac{1}{n} \sum_{i=1}^n x_i\right) + \frac{\log n}{2n} + O(1/n),</math>
where <math>H_b</math> is the [[binary entropy function]] (not to be confused with the entropy rate).
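A sketch of the two-part counting argument behind this bound (the exact constants depend on the encoding conventions, and the boundary cases <math>k \in \{0, n\}</math> require separate treatment): given <math>n</math>, the string <math>x_{1:n}</math> is determined by <math>k = \sum_{i=1}^n x_i</math> together with its index among the <math>\tbinom{n}{k}</math> strings of length <math>n</math> containing exactly <math>k</math> ones, so
<math display="block">K(x_{1:n} \mid n) \leq \log \binom{n}{k} + \log n + O(1) \leq n\,H_b\!\left(\frac{k}{n}\right) + \frac{\log n}{2} + O(1),</math>
where the second inequality uses the Stirling-type estimate <math>\log \tbinom{n}{k} \leq n H_b(k/n) - \tfrac{1}{2}\log n + O(1)</math>, valid when <math>k/n</math> is bounded away from 0 and 1. Dividing by <math>n</math> gives the stated bound.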