==Use in number theory==
[[Terence Tao]] used entropy to make a useful connection in his solution of the [[Erdős discrepancy problem]].<ref>{{Cite web |last=Klarreich |first=Erica |author-link=Erica Klarreich |date=1 October 2015 |title=A Magical Answer to an 80-Year-Old Puzzle |url=https://www.quantamagazine.org/a-magical-answer-to-an-80-year-old-puzzle-20151001/ |access-date=18 August 2014 |website=[[Quanta Magazine]]}}</ref><ref>{{Cite journal |last=Tao |first=Terence |date=2016-02-28 |title=The Erdős discrepancy problem |url=https://discreteanalysisjournal.com/article/609 |journal=Discrete Analysis |language=en |arxiv=1509.05363v6 |doi=10.19086/da.609 |s2cid=59361755 |access-date=20 September 2023 |archive-date=25 September 2023 |archive-url=https://web.archive.org/web/20230925184904/https://discreteanalysisjournal.com/article/609 |url-status=live }}</ref>

Intuitively, the idea behind the proof is as follows. The relevant random variables are built from the [[Liouville function]], a multiplicative function used in studying the distribution of primes, by setting {{math|''X''<sub>''H''</sub>}} {{=}} <math>\lambda(n+H)</math> for a randomly chosen {{mvar|n}}. If consecutive variables of this kind carry little information in the sense of Shannon entropy, then the sum over an interval {{math|[''n'', ''n'' + ''H'']}} can become arbitrarily large; for example, a sequence consisting only of +1's (one of the values {{math|''X''<sub>''H''</sub>}} can take) has trivially low entropy, and its partial sums grow without bound. The key insight was to show that the entropy decreases by a non-negligible amount as {{mvar|H}} grows, and that this forces unbounded growth of the associated partial sums, which is exactly the unbounded growth asserted by the [[Erdős discrepancy problem]]. The proof is quite involved: besides the novel use of Shannon entropy, it relied on estimates for the [[Liouville function]] and for averages of modulated multiplicative functions<ref>https://arxiv.org/pdf/1502.02374.pdf {{Webarchive|url=https://web.archive.org/web/20231028111132/https://arxiv.org/pdf/1502.02374.pdf |date=28 October 2023 }}</ref> in short intervals. The proof also broke the "parity barrier"<ref>https://terrytao.wordpress.com/2007/06/05/open-question-the-parity-problem-in-sieve-theory/ {{Webarchive|url=https://web.archive.org/web/20230807211237/https://terrytao.wordpress.com/2007/06/05/open-question-the-parity-problem-in-sieve-theory/ |date=7 August 2023 }}</ref> for this specific problem. The use of Shannon entropy in the proof was novel and is likely to open new research in this direction.
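For context, a standard formulation of the problem (stated here for orientation, not drawn from the sources cited above) is the following: for every sequence {{math|''f''(1), ''f''(2), …}} taking values {{math|±1}}, is the discrepancy

<math display="block">\sup_{n,\,d \ge 1} \left| \sum_{j=1}^{n} f(jd) \right|</math>

necessarily infinite? Tao's result answers this in the affirmative, and the Liouville function <math>\lambda(n) = (-1)^{\Omega(n)}</math>, where <math>\Omega(n)</math> counts the prime factors of {{mvar|n}} with multiplicity, is the {{math|±1}} sequence central to the argument.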