{{Short description|Exponentially decreasing bounds on tail distributions of random variables}} In [[probability theory]], a '''Chernoff bound''' is an exponentially decreasing upper bound on the tail of a random variable based on its [[moment generating function]]. The minimum of all such exponential bounds forms ''the'' Chernoff or '''Chernoff–Cramér bound''', which may decay faster than exponentially (e.g. [[Sub-Gaussian distribution|sub-Gaussian]]).<ref name="blm">{{Cite book|last=Boucheron|first=Stéphane|url=https://www.worldcat.org/oclc/837517674|title=Concentration Inequalities: a Nonasymptotic Theory of Independence|date=2013|publisher=Oxford University Press|others=Gábor Lugosi, Pascal Massart|isbn=978-0-19-953525-5|location=Oxford|page=21|oclc=837517674}}</ref><ref>{{Cite web|last=Wainwright|first=M.|date=January 22, 2015|title=Basic tail and concentration bounds|url=https://www.stat.berkeley.edu/~mjwain/stat210b/Chap2_TailBounds_Jan22_2015.pdf|url-status=live|archive-url=https://web.archive.org/web/20160508170739/http://www.stat.berkeley.edu:80/~mjwain/stat210b/Chap2_TailBounds_Jan22_2015.pdf |archive-date=2016-05-08 }}</ref> It is especially useful for sums of independent random variables, such as sums of [[Bernoulli random variable]]s.<ref>{{Cite book|last=Vershynin|first=Roman|url=https://www.worldcat.org/oclc/1029247498|title=High-dimensional probability : an introduction with applications in data science|date=2018|isbn=978-1-108-41519-4|location=Cambridge, United Kingdom|oclc=1029247498|page=19}}</ref><ref>{{Cite journal|last=Tropp|first=Joel A.|date=2015-05-26|title=An Introduction to Matrix Concentration Inequalities|url=https://www.nowpublishers.com/article/Details/MAL-048|journal=Foundations and Trends in Machine Learning|language=English|volume=8|issue=1–2|page=60|doi=10.1561/2200000048|arxiv=1501.01571|s2cid=5679583|issn=1935-8237}}</ref> The bound is commonly named after [[Herman Chernoff]], who described the method in a 1952 paper,<ref>{{Cite journal|last=Chernoff|first=Herman|date=1952|title=A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the sum of Observations|journal=The Annals of Mathematical Statistics|volume=23|issue=4|pages=493–507|doi=10.1214/aoms/1177729330|jstor=2236576|issn=0003-4851|doi-access=free}}</ref> though Chernoff himself attributed it to Herman Rubin.<ref>{{cite book | url=http://www.crcpress.com/product/isbn/9781482204964 | title=Past, Present, and Future of Statistics | chapter=A career in statistics | page=35 | publisher=CRC Press | last1=Chernoff | first1=Herman | editor-first1=Xihong | editor-last1=Lin | editor-first2=Christian | editor-last2=Genest | editor-first3=David L. | editor-last3=Banks | editor-first4=Geert | editor-last4=Molenberghs | editor-first5=David W. | editor-last5=Scott | editor-first6=Jane-Ling | editor-last6=Wang | editor6-link = Jane-Ling Wang| year=2014 | isbn=9781482204964 | archive-url=https://web.archive.org/web/20150211232731/https://nisla05.niss.org/copss/past-present-future-copss.pdf | archive-date=2015-02-11 | chapter-url=https://nisla05.niss.org/copss/past-present-future-copss.pdf}}</ref> In 1938, [[Harald Cramér]] had published an almost identical concept now known as [[Cramér's theorem (large deviations)|Cramér's theorem]]. It is a sharper bound than first- or second-moment-based tail bounds such as [[Markov's inequality]] or [[Chebyshev's inequality]], which yield only power-law bounds on tail decay.
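In its simplest form, writing <math>X</math> for the random variable of interest and <math>a</math> for a threshold, the bound follows from applying Markov's inequality to the nonnegative random variable <math>e^{tX}</math>: for every <math>t > 0</math>,
<math display="block">\Pr(X \geq a) = \Pr\left(e^{tX} \geq e^{ta}\right) \leq \frac{\operatorname{E}\left[e^{tX}\right]}{e^{ta}},</math>
and minimizing the right-hand side over <math>t</math> yields the Chernoff bound
<math display="block">\Pr(X \geq a) \leq \inf_{t > 0} e^{-ta} \operatorname{E}\left[e^{tX}\right].</math>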
However, when applied to sums, the Chernoff bound requires the random variables to be independent, a condition that is not required by either Markov's inequality or Chebyshev's inequality. The Chernoff bound is related to the [[Bernstein inequalities (probability theory)|Bernstein inequalities]]. It is also used to prove [[Hoeffding's inequality]], [[Bennett's inequality]], and [[Doob martingale#McDiarmid's inequality|McDiarmid's inequality]].
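To illustrate the difference in decay rates, let <math>X</math> be the sum of <math>n</math> independent fair Bernoulli variables, so that <math>\operatorname{E}[X] = n/2</math> and <math>\operatorname{Var}(X) = n/4</math>. Chebyshev's inequality bounds the tail only polynomially in <math>n</math>,
<math display="block">\Pr\left(X \geq \tfrac{3n}{4}\right) \leq \frac{\operatorname{Var}(X)}{(n/4)^2} = \frac{4}{n},</math>
whereas optimizing the exponential bound over <math>t</math> gives a bound that decays exponentially in <math>n</math>:
<math display="block">\Pr\left(X \geq \tfrac{3n}{4}\right) \leq e^{-n/8}.</math>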