==Mathematical treatment==
[[File:probability vs odds.svg|thumb|right|Calculation of probability (risk) vs odds]]
{{see also|Probability axioms}}

Consider an experiment that can produce a number of results. The collection of all possible results is called the [[sample space]] of the experiment, sometimes denoted as <math>\Omega</math>. The [[power set]] of the sample space is formed by considering all different collections of possible results. For example, rolling a die can produce six possible results. One collection of possible results gives an odd number on the die. Thus, the subset {1,3,5} is an element of the [[power set]] of the sample space of dice rolls. These collections are called "events". In this case, {1,3,5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, the event is said to have occurred.

A probability is a [[Function (mathematics)|way of assigning]] every event a value between zero and one, with the requirement that the event made up of all possible results (in our example, the event {1,2,3,4,5,6}) is assigned a value of one. To qualify as a probability, the assignment of values must satisfy the requirement that for any collection of mutually exclusive events (events with no common results, such as the events {1,6}, {3}, and {2,4}), the probability that at least one of the events will occur is given by the sum of the probabilities of all the individual events.<ref>{{cite book|last = Ross|first = Sheldon M.|title = A First Course in Probability|edition = 8th|pages = 26–27|publisher = Pearson Prentice Hall|date = 2010|isbn = 9780136033134}}</ref>

The probability of an [[Event (probability theory)|event]] ''A'' is written as <math>P(A)</math>,<ref name=":2">{{Cite web|last=Weisstein|first=Eric W.|title=Probability|url=https://mathworld.wolfram.com/Probability.html|access-date=2020-09-10|website=mathworld.wolfram.com|language=en}}</ref> <math>p(A)</math>, or <math>\text{Pr}(A)</math>.<ref>Olofsson (2005) p. 8.</ref> This mathematical definition of probability can extend to infinite sample spaces, and even uncountable sample spaces, using the concept of a measure.

The ''opposite'' or ''complement'' of an event ''A'' is the event [not ''A''] (that is, the event of ''A'' not occurring), often denoted as <math>A', A^c</math>, <math>\overline{A}, A^\complement, \neg A</math>, or <math>{\sim}A</math>; its probability is given by {{nowrap|1= ''P''(not ''A'') = 1 − ''P''(''A'')}}.<ref>Olofsson (2005), p. 9</ref> As an example, the chance of not rolling a six on a six-sided die is {{nowrap|1=1 − (chance of rolling a six) =}} {{nowrap|1=1 − {{sfrac|1|6}} = {{sfrac|5|6}}.}} For a more comprehensive treatment, see [[Complementary event]].
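These defining properties can be checked mechanically on the die example. The following Python sketch is an illustrative addition (not part of the article; the names `omega` and `P` are chosen for exposition): it represents events as subsets of the sample space and uses exact rational arithmetic from the standard `fractions` module.

```python
from fractions import Fraction

# Sample space of one roll of a fair six-sided die; each outcome equally likely.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event & omega), len(omega))

# The event made up of all possible results is assigned a value of one.
assert P(omega) == 1

# Additivity over mutually exclusive events, e.g. {1,6}, {3}, and {2,4}:
# the probability that at least one occurs is the sum of the probabilities.
assert P({1, 6} | {3} | {2, 4}) == P({1, 6}) + P({3}) + P({2, 4})

# Complement rule: P(not A) = 1 - P(A), e.g. not rolling a six.
not_six = omega - {6}
assert P(not_six) == 1 - P({6}) == Fraction(5, 6)
```

Representing events as subsets mirrors the power-set construction described above: every element of the power set of `omega` is a valid argument to `P`.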
If two events ''A'' and ''B'' occur on a single performance of an experiment, this is called the intersection or [[Joint distribution|joint probability]] of ''A'' and ''B'', denoted as <math>P(A \cap B).</math>

===Independent events===
If two events, ''A'' and ''B'', are [[Independence (probability theory)|independent]] then the joint probability is<ref name=":2" />
<math display="block" qid=Q120632573>P(A \mbox{ and }B) = P(A \cap B) = P(A) P(B).</math>
[[File:Independent and Non-independent Probability Events.jpg|thumb|Events A and B depicted as independent vs non-independent in space Ω]]
For example, if two coins are flipped, then the chance of both being heads is <math>\tfrac{1}{2}\times\tfrac{1}{2} = \tfrac{1}{4}.</math><ref>Olofsson (2005) p. 35.</ref>

===Mutually exclusive events===
{{Main|Mutual exclusivity}}
If either event ''A'' or event ''B'' can occur but never both simultaneously, then they are called mutually exclusive events.

If two events are [[Mutually exclusive events|mutually exclusive]], then the probability of ''both'' occurring is denoted as <math>P(A \cap B)</math> and
<math display="block">P(A \mbox{ and }B) = P(A \cap B) = 0</math>
If two events are [[Mutually exclusive events|mutually exclusive]], then the probability of ''either'' occurring is denoted as <math>P(A \cup B)</math> and
<math display="block">P(A\mbox{ or }B) = P(A \cup B)= P(A) + P(B) - P(A \cap B) = P(A) + P(B) - 0 = P(A) + P(B)</math>
For example, the chance of rolling a 1 or 2 on a six-sided die is <math>P(1\mbox{ or }2) = P(1) + P(2) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}.</math>

===Not (necessarily) mutually exclusive events===
If the events are not (necessarily) mutually exclusive then
<math display="block">P\left(A \hbox{ or } B\right) = P(A \cup B) = P\left(A\right)+P\left(B\right)-P\left(A \mbox{ and } B\right).</math>
Rewritten,
<math display="block"> P\left( A\cup B\right) =P\left( A\right) +P\left( B\right) -P\left( A\cap B\right) </math>
For example, when
drawing a card from a deck of cards, the chance of getting a heart or a face card (J, Q, K) (or both) is <math>\tfrac{13}{52} + \tfrac{12}{52} - \tfrac{3}{52} = \tfrac{11}{26},</math> since among the 52 cards of a deck, 13 are hearts, 12 are face cards, and 3 are both: here the possibilities included in the "3 that are both" are included in each of the "13 hearts" and the "12 face cards", but should only be counted once.

This can be expanded further for multiple not (necessarily) mutually exclusive events. For three events, this proceeds as follows:
<math display="block"> \begin{aligned}P\left( A\cup B\cup C\right) =&P\left( \left( A\cup B\right) \cup C\right) \\ =&P\left( A\cup B\right) +P\left( C\right) -P\left( \left( A\cup B\right) \cap C\right) \\ =&P\left( A\right) +P\left( B\right) -P\left( A\cap B\right) +P\left( C\right) -P\left( \left( A\cap C\right) \cup \left( B\cap C\right) \right) \\ =&P\left( A\right) +P\left( B\right) +P\left( C\right) -P\left( A\cap B\right) -\left( P\left( A\cap C\right) +P\left( B\cap C\right) -P\left( \left( A\cap C\right) \cap \left( B\cap C\right) \right) \right) \\ P\left( A\cup B\cup C\right) =&P\left( A\right) +P\left( B\right) +P\left( C\right) -P\left( A\cap B\right) -P\left( A\cap C\right) -P\left( B\cap C\right) +P\left( A\cap B\cap C\right) \end{aligned} </math>
It can be seen, then, that this pattern can be repeated for any number of events.

===Conditional probability===
''[[Conditional probability]]'' is the probability of some event ''A'', given the occurrence of some other event ''B''. Conditional probability is written <math>P(A \mid B)</math>, and is read "the probability of ''A'', given ''B''". It is defined by<ref>Olofsson (2005) p. 29.</ref>
<math display="block">P(A \mid B) = \frac{P(A \cap B)}{P(B)}\,</math>
If <math>P(B)=0</math> then <math>P(A \mid B)</math> is formally [[undefined (mathematics)|undefined]] by this expression.
In this case <math>A</math> and <math>B</math> are independent, since <math>P(A \cap B) = P(A)P(B) = 0.</math> However, it is possible to define a conditional probability for some zero-probability events, for example by using a [[σ-algebra]] of such events (such as those arising from a [[continuous random variable]]).<ref>{{Cite web |title=Conditional probability with respect to a sigma-algebra |url=https://www.statlect.com/fundamentals-of-probability/conditional-probability-as-a-random-variable |access-date=2022-07-04 |website=statlect.com}}</ref>

For example, in a bag of 2 red balls and 2 blue balls (4 balls in total), the probability of taking a red ball is <math>1/2;</math> however, when taking a second ball, the probability of it being either a red ball or a blue ball depends on the ball previously taken. For example, if a red ball was taken, then the probability of picking a red ball again would be <math>1/3,</math> since only 1 red and 2 blue balls would have been remaining. And if a blue ball was taken previously, the probability of taking a red ball would be <math>2/3.</math>

===Inverse probability===
{{Main|Inverse probability}}
In [[probability theory]] and applications, ''[[Bayes' theorem|Bayes' rule]]'' relates the [[odds]] of event <math>A_1</math> to event <math>A_2,</math> before (prior to) and after (posterior to) [[Conditional probability|conditioning]] on another event <math>B.</math> The odds of <math>A_1</math> to <math>A_2</math> are simply the ratio of the probabilities of the two events. When arbitrarily many events <math>A</math> are of interest, not just two, the rule can be rephrased as ''posterior is proportional to prior times likelihood'', <math>P(A|B)\propto P(A) P(B|A)</math>, where the proportionality symbol means that the left hand side is proportional to (i.e., equals a constant times) the right hand side as <math>A</math> varies, for fixed or given <math>B</math> (Lee, 2012; Bertsch McGrayne, 2012).
In this form it goes back to Laplace (1774) and to Cournot (1843); see Fienberg (2005).

===Summary of probabilities===
{| class="wikitable plainrowheaders" style="text-align: left;"
|+Summary of probabilities
|-
! scope="col" | Event
! scope="col" | Probability
|-
! scope="row" style="text-align: center;" | A
| <math>P(A)\in[0,1]</math>
|-
! scope="row" style="text-align: center;" | not A
| <math>P(A^\complement)=1-P(A)\,</math>
|-
! scope="row" style="text-align: center;" | A or B
| <math>\begin{align} P(A\cup B) & = P(A)+P(B)-P(A\cap B) \\ P(A\cup B) & = P(A)+P(B) \qquad\mbox{if A and B are mutually exclusive} \\ \end{align}</math>
|-
! scope="row" style="text-align: center;" | A and B
| <math>\begin{align} P(A\cap B) & = P(A|B)P(B) = P(B|A)P(A)\\ P(A\cap B) & = P(A)P(B) \qquad\mbox{if A and B are independent}\\ \end{align}</math>
|-
! scope="row" style="text-align: center;" | A given B
| <math>P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B|A)P(A)}{P(B)} \,</math>
|}
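Each identity in the summary can likewise be verified on a small example. The Python sketch below is an illustrative addition (not part of the article); the events ''A'', ''B'', and ''C'' are arbitrary choices on a single die roll, and exact rational arithmetic avoids floating-point rounding.

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}  # fair die roll, uniform measure

def P(event):
    """Probability of an event (a subset of omega)."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}  # an even roll
B = {4, 5, 6}  # a roll greater than 3
C = {1, 2}     # a roll of 1 or 2

# "A or B": P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Mutually exclusive events, e.g. {1} and {2}: the intersection term vanishes.
assert P({1} | {2}) == P({1}) + P({2}) == Fraction(1, 3)

# "A and B": P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A)
P_A_given_B = P(A & B) / P(B)
P_B_given_A = P(A & B) / P(A)
assert P(A & B) == P_A_given_B * P(B) == P_B_given_A * P(A)

# Independent events: A (even) and C ({1,2}) happen to satisfy P(A ∩ C) = P(A) P(C).
assert P(A & C) == P(A) * P(C) == Fraction(1, 6)

# "A given B", Bayes' form: P(A|B) = P(B|A) P(A) / P(B)
assert P_A_given_B == P_B_given_A * P(A) / P(B)

# "not A": P(A^∁) = 1 - P(A)
assert P(omega - A) == 1 - P(A)
```

Because every quantity is a `Fraction`, each assertion checks the corresponding table row exactly rather than up to rounding error.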