==Use in combinatorics==
Entropy has become a useful quantity in [[combinatorics]].

===Loomis–Whitney inequality===
A simple example of this is an alternative proof of the [[Loomis–Whitney inequality]]: for every subset {{math|''A'' ⊆ '''Z'''<sup>''d''</sup>}}, we have
<math display="block"> |A|^{d-1}\leq \prod_{i=1}^{d} |P_{i}(A)|</math>
where {{math|''P''<sub>''i''</sub>}} is the [[orthogonal projection]] in the {{math|''i''}}th coordinate:
<math display="block"> P_{i}(A)=\{(x_{1}, \ldots, x_{i-1}, x_{i+1}, \ldots, x_{d}) : (x_{1}, \ldots, x_{d})\in A\}.</math>

The proof follows as a simple corollary of [[Shearer's inequality]]: if {{math|''X''<sub>1</sub>, ..., ''X''<sub>''d''</sub>}} are random variables and {{math|''S''<sub>1</sub>, ..., ''S''<sub>''n''</sub>}} are subsets of {{math|{1, ..., ''d''}}} such that every integer between 1 and {{math|''d''}} lies in exactly {{math|''r''}} of these subsets, then
<math display="block"> \Eta[(X_{1}, \ldots ,X_{d})]\leq \frac{1}{r}\sum_{i=1}^{n}\Eta[(X_{j})_{j\in S_{i}}]</math>
where <math> (X_{j})_{j\in S_{i}}</math> is the joint random variable formed from the {{math|''X''<sub>''j''</sub>}} with indices {{math|''j''}} in {{math|''S''<sub>''i''</sub>}} (so the dimension of this vector equals the size of {{math|''S''<sub>''i''</sub>}}).

We sketch how Loomis–Whitney follows from this. Let {{math|''X''}} be a random variable distributed uniformly over {{math|''A''}}, so that each point of {{math|''A''}} occurs with equal probability. Then (by the properties of entropy mentioned above) {{math|Η(''X'') {{=}} log{{abs|''A''}}}}, where {{math|{{abs|''A''}}}} denotes the cardinality of {{math|''A''}}. Let {{math|''S''<sub>''i''</sub> {{=}} {1, 2, ..., ''i''−1, ''i''+1, ..., ''d''}}}; each index between 1 and {{math|''d''}} lies in exactly {{math|''r'' {{=}} ''d'' − 1}} of these sets. The range of <math>(X_{j})_{j\in S_{i}}</math> is contained in {{math|''P''<sub>''i''</sub>(''A'')}}, and hence <math> \Eta[(X_{j})_{j\in S_{i}}]\leq \log |P_{i}(A)|</math>. Using this to bound the right side of Shearer's inequality and exponentiating both sides of the resulting inequality yields the result.

===Approximation to binomial coefficient===
For integers {{math|0 < ''k'' < ''n''}} let {{math|1=''q'' = ''k''/''n''}}. Then
<math display="block">\frac{2^{n\Eta(q)}}{n+1} \leq \tbinom nk \leq 2^{n\Eta(q)},</math>
where<ref>Aoki, ''New Approaches to Macroeconomic Modeling''.</ref>{{rp|p=43}}
<math display="block">\Eta(q) = -q \log_2(q) - (1-q) \log_2(1-q).</math>

{| class="toccolours collapsible collapsed" width="80%" style="text-align:left; margin:1.5em;"
!Proof (sketch)
|-
|Note that <math>\tbinom nk q^{qn}(1-q)^{n-nq}</math> is one term of the expression
<math display="block">\sum_{i=0}^n \tbinom ni q^i(1-q)^{n-i} = (q + (1-q))^n = 1.</math>
Rearranging gives the upper bound. For the lower bound, one first shows, using some algebra, that it is the largest term in the summation. But then,
<math display="block">\binom nk q^{qn}(1-q)^{n-nq} \geq \frac{1}{n+1},</math>
since there are {{math|''n'' + 1}} terms in the summation. Rearranging gives the lower bound.
|}

A nice interpretation of this is that the number of binary strings of length {{math|''n''}} with exactly {{math|''k''}} ones is approximately <math>2^{n\Eta(k/n)}</math>.<ref>M. Mitzenmacher and E. Upfal, ''Probability and Computing: Randomized Algorithms and Probabilistic Analysis'', Cambridge University Press.</ref>
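Both results above are easy to check numerically. The following Python sketch is purely illustrative and not drawn from the cited sources; the helper <code>binary_entropy</code>, the parameter ranges, and the random test sets are arbitrary choices. It verifies the entropy bounds on the binomial coefficient for several values of {{math|''n''}} and {{math|''k''}}, and the Loomis–Whitney inequality for random finite subsets of {{math|'''Z'''<sup>3</sup>}}.

<syntaxhighlight lang="python">
import math
import random

def binary_entropy(q):
    """Binary entropy H(q) = -q log2 q - (1-q) log2 (1-q), with H(0) = H(1) = 0."""
    if q in (0, 1):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Check 2^{n H(k/n)} / (n+1) <= C(n, k) <= 2^{n H(k/n)} for 0 < k < n.
for n in (10, 50, 200):
    for k in range(1, n):
        upper = 2 ** (n * binary_entropy(k / n))
        assert upper / (n + 1) <= math.comb(n, k) <= upper

# Check the Loomis-Whitney inequality |A|^{d-1} <= prod_i |P_i(A)|
# for random finite subsets A of Z^3, where P_i drops the i-th coordinate.
random.seed(0)
d = 3
for _ in range(100):
    A = {tuple(random.randrange(5) for _ in range(d))
         for _ in range(random.randrange(1, 40))}
    projections = [{p[:i] + p[i + 1:] for p in A}  # drop coordinate i
                   for i in range(d)]
    assert len(A) ** (d - 1) <= math.prod(len(P) for P in projections)

print("All bounds verified.")
</syntaxhighlight>

In the binomial check, the upper bound is within a factor of roughly {{math|{{sqrt|''n''}}}} of the true value for {{math|''k''}} near {{math|''n''/2}}, consistent with the interpretation of <math>2^{n\Eta(k/n)}</math> as an exponential-order approximation.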