Chebyshev's inequality
{{short description|Bound on probability of a random variable being far from its mean}}
{{For|the similarly named inequality involving series|Chebyshev's sum inequality}}
In [[probability theory]], '''Chebyshev's inequality''' (also called the '''Bienaymé–Chebyshev inequality''') provides an upper bound on the probability that a [[random variable]] with finite variance deviates from its mean. More specifically, the probability that a random variable deviates from its mean by more than <math>k\sigma</math> is at most <math>1/k^2</math>, where <math>k</math> is any positive constant and <math>\sigma</math> is the [[standard deviation]] (the square root of the variance). In statistics, the rule is often called '''Chebyshev's theorem''', concerning the range of standard deviations around the mean.

The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the [[weak law of large numbers]]. Its practical usage is similar to the [[68–95–99.7 rule]], which applies only to [[normal distribution]]s. Chebyshev's inequality is more general: for a broad range of [[probability distributions]], at least 75% of values must lie within two standard deviations of the mean, and at least 88.89% within three standard deviations.<ref name=Kvanli>{{cite book |last1=Kvanli |first1=Alan H. |last2=Pavur |first2=Robert J. |last3=Keeling |first3=Kellie B. |title=Concise Managerial Statistics |url=https://books.google.com/books?id=h6CQ1J0gwNgC&pg=PT95 |year=2006 |publisher=[[Cengage Learning]] |isbn=978-0-324-22388-0 |pages=81–82}}</ref><ref name=Chernick>{{cite book |last=Chernick |first=Michael R. |title=The Essentials of Biostatistics for Physicians, Nurses, and Clinicians |url=https://books.google.com/books?id=JP4azqd8ONEC&pg=PA50 |year=2011 |publisher=[[John Wiley & Sons]] |isbn=978-0-470-64185-9 |pages=49–50}}</ref>

The term ''Chebyshev's inequality'' may also refer to [[Markov's inequality]], especially in the context of analysis. The two are closely related, and some authors refer to [[Markov's inequality]] as "Chebyshev's first inequality" and the inequality discussed here as "Chebyshev's second inequality".

Chebyshev's inequality is tight in the sense that for each chosen positive constant, there exists a random variable for which the inequality is in fact an equality.<ref>{{Cite web |title=Error Term of Chebyshev inequality? |url=https://math.stackexchange.com/a/776424/352034 |access-date=2023-12-11 |website=Mathematics Stack Exchange |language=en}}</ref>
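Stated formally, if <math>X</math> is a random variable with finite expected value <math>\mu</math> and finite non-zero variance <math>\sigma^2</math>, then for any real number <math>k > 0</math>,

<math display="block">\Pr(|X-\mu| \geq k\sigma) \leq \frac{1}{k^2}.</math>

Only the case <math>k > 1</math> is informative, since for <math>k \leq 1</math> the right-hand side is at least 1 and the bound is vacuous. Taking <math>k = 2</math> gives <math>\Pr(|X-\mu| \geq 2\sigma) \leq 1/4</math>, so at least 75% of the probability mass lies within two standard deviations of the mean; taking <math>k = 3</math> gives the <math>1 - 1/9 = 8/9 \approx 88.89\%</math> figure cited above.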