Probability distribution
==Introduction==
A probability distribution is a mathematical description of the probabilities of events, subsets of the [[sample space]]. The sample space, often represented in notation by <math>\ \Omega\ ,</math> is the [[Set (mathematics)|set]] of all possible [[outcome (probability)|outcomes]] of a random phenomenon being observed. The sample space may be any set: a set of [[real numbers]], a set of descriptive labels, a set of [[vector (mathematics)|vectors]], a set of arbitrary non-numerical values, etc. For example, the sample space of a coin flip could be {{math|{{nobr|Ω {{=}} {{big|<nowiki>{</nowiki>}}"heads", "tails"{{big|<nowiki>}</nowiki>}}}}.}}

To define probability distributions for the specific case of [[random variables]] (so the sample space can be seen as a numeric set), it is common to distinguish between '''discrete''' and '''continuous''' [[random variable]]s. In the discrete case, it is sufficient to specify a [[probability mass function]] <math>p</math> assigning a probability to each possible outcome (e.g. when throwing a fair [[dice|die]], each of the six digits {{math|"1"}} to {{math|"6"}}, corresponding to the number of dots on the die, has probability <math>\tfrac{1}{6}</math>). The probability of an [[Event (probability theory)|event]] is then defined to be the sum of the probabilities of all outcomes that satisfy the event; for example, the probability of the event "the die rolls an even value" is <math display="block">p(\text{"}2\text{"}) + p(\text{"}4\text{"}) + p(\text{"}6\text{"}) = \frac{1}{6} + \frac{1}{6} + \frac{1}{6} = \frac{1}{2}.</math>

In contrast, when a random variable takes values from a continuum, then by convention any individual outcome is assigned probability zero. For such continuous random variables, only events that include infinitely many outcomes, such as intervals, have probability greater than 0.
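The discrete case described above can be sketched directly in code: a probability mass function is simply a map from outcomes to probabilities, and an event's probability is the sum over the outcomes that satisfy it. A minimal Python sketch for the fair-die example (the variable names are illustrative, not standard):

```python
from fractions import Fraction

# Probability mass function of a fair six-sided die:
# each outcome 1..6 has probability 1/6.
p = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

# The probability of an event is the sum of the probabilities of
# all outcomes satisfying it, e.g. "the die rolls an even value".
even = [outcome for outcome in p if outcome % 2 == 0]
prob_even = sum(p[o] for o in even)
print(prob_even)  # 1/2
```

Using exact `Fraction` arithmetic makes it easy to check that the probabilities of all outcomes sum to 1, as any probability mass function must.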
For example, consider measuring the weight of a piece of ham in the supermarket, and assume the scale can provide arbitrarily many digits of precision. Then, the probability that it weighs ''exactly'' 500 [[gram|g]] must be zero: no matter how high the level of precision chosen, it cannot be assumed that there are no non-zero decimal digits among the remaining digits omitted at that precision level. However, for the same use case, it is possible to meet quality-control requirements such as that a package of "500 g" of ham must weigh between 490 g and 510 g with at least 98% probability. This is possible because this measurement does not require as much precision from the underlying equipment.

[[File:Combined Cumulative Distribution Graphs.png|thumb|455x455px| Figure 1: The left graph shows a probability density function. The right graph shows the cumulative distribution function. The value at {{font color|#ED1C24|'''a'''}} in the cumulative distribution equals the area under the probability density curve up to the point {{font color|#ED1C24|'''a'''}}.]]

Continuous probability distributions can be described by means of the [[cumulative distribution function]], which gives the probability that the random variable is no larger than a given value (i.e., {{math|''P''(''X'' ≤ ''x'')}} for some {{mvar|x}}). The cumulative distribution function is the area under the [[probability density function]] from {{math|−∞}} to {{mvar|x}}, as shown in figure 1.<ref name='dekking'>{{cite book |last=Dekking |first=Michel |year=2005 |title=A Modern Introduction to Probability and Statistics: Understanding Why and How |publisher=Springer |isbn=978-1-85233-896-1 |location=London, UK |oclc=262680588}}</ref> Most continuous probability distributions encountered in practice are not only continuous but also [[absolutely continuous]]. Such distributions can be described by their [[probability density function]].
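The ham-weight example can be made concrete with a cumulative distribution function. As an illustration only, suppose the weight is normally distributed; the mean of 500 g matches the label, but the standard deviation of 4 g is an assumed value, not something stated in the text. The probability of landing in an interval is then a difference of two CDF values, while any single exact weight has probability zero:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal random variable: the area under the
    density curve from minus infinity to x (expressed via erf)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu, sigma = 500.0, 4.0  # assumed parameters, in grams

# P(490 <= X <= 510) = F(510) - F(490)
prob = normal_cdf(510, mu, sigma) - normal_cdf(490, mu, sigma)
print(prob)  # about 0.9876, meeting the 98% requirement here

# The probability of weighing *exactly* 500 g is F(500) - F(500) = 0.
assert normal_cdf(500, mu, sigma) - normal_cdf(500, mu, sigma) == 0.0
```

With these assumed parameters the interval [490 g, 510 g] spans ±2.5 standard deviations, which is why the probability comfortably exceeds 98%.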
Informally, the probability density <math>f</math> of a random variable <math>X</math> describes the [[infinitesimal]] probability that <math>X</math> takes any value <math>x</math>; that is, <math>P(x \leq X < x + \Delta x) \approx f(x) \, \Delta x</math> as <math>\Delta x > 0</math> becomes arbitrarily small. The probability that <math>X</math> lies in a given interval can be computed rigorously by [[Integration (mathematics)|integrating]] the probability density function over that interval.<ref name=":3"/>
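The infinitesimal approximation above can be checked numerically. A sketch using the standard exponential distribution, chosen only because both its density and its CDF have simple closed forms (it is not tied to the article's examples): for small Δx, the exact interval probability F(x + Δx) − F(x) and the approximation f(x)·Δx nearly coincide.

```python
from math import exp

def f(x):
    """Density of the standard exponential distribution (rate 1)."""
    return exp(-x)

def F(x):
    """Its cumulative distribution function, F(x) = 1 - e^(-x)."""
    return 1.0 - exp(-x)

x, dx = 1.0, 1e-6
exact = F(x + dx) - F(x)   # P(x <= X < x + dx), via the CDF
approx = f(x) * dx         # infinitesimal approximation f(x) * dx
print(exact, approx)       # the two agree to many significant digits
```

The relative error shrinks in proportion to Δx, reflecting that f(x)·Δx is the first-order term of F(x + Δx) − F(x).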