{{Short description|Concept in probability theory}}
{{Probability fundamentals}}

In [[probability theory]], the '''law''' (or '''formula''') '''of total probability''' is a fundamental rule relating [[Marginal probability|marginal probabilities]] to [[conditional probabilities]]. It expresses the total probability of an outcome that can be realized via several distinct [[Event (probability theory)|events]], hence the name.

==Statement==
The law of total probability is<ref name=ZK>Zwillinger, D., Kokoska, S. (2000) ''CRC Standard Probability and Statistics Tables and Formulae'', CRC Press. {{isbn|1-58488-059-7}} page 31.</ref> a [[theorem]] that states, in its discrete case, that if <math>\left\{ B_n : n = 1, 2, 3, \ldots \right\}</math> is a finite or [[Countable set|countably infinite]] set of [[mutually exclusive]] and [[collectively exhaustive]] events, then for any event <math>A</math>

:<math>P(A)=\sum_n P(A\cap B_n)</math>

or, alternatively,<ref name=ZK/>

:<math>P(A)=\sum_n P(A\mid B_n)P(B_n),</math>

where, for any <math>n</math> with <math>P(B_n) = 0</math>, the corresponding term is simply omitted from the summation, since <math>P(A\mid B_n)</math> is then undefined.

The summation can be interpreted as a [[weighted average]], and consequently the marginal probability, <math>P(A)</math>, is sometimes called "average probability";<ref name="Pfeiffer1978">{{cite book|author=Paul E. Pfeiffer|title=Concepts of probability theory|url=https://books.google.com/books?id=_mayRBczVRwC&pg=PA47|year=1978|publisher=Courier Dover Publications|isbn=978-0-486-63677-1|pages=47–48}}</ref> "overall probability" is sometimes used in less formal writing.<ref name="Rumsey2006">{{cite book|author=Deborah Rumsey|authorlink=Deborah J. Rumsey|title=Probability for dummies|url=https://books.google.com/books?id=Vj3NZ59ZcnoC&pg=PA58|year=2006|publisher=For Dummies|isbn=978-0-471-75141-0|page=58}}</ref>

The law of total probability can also be stated for conditional probabilities:

:<math>P(A \mid C) = \frac{P(A, C)}{P(C)} = \frac{\sum_n P(A, B_n, C)}{P(C)} = \frac{\sum_n P(A \mid B_n, C)\,P(B_n \mid C)\,P(C)}{P(C)} = \sum_n P(A \mid B_n, C)\,P(B_n \mid C)</math>

Taking the <math>B_n</math> as above, and assuming <math>C</math> is an event [[Independence (probability theory)|independent]] of each of the <math>B_n</math>:

:<math>P(A \mid C) = \sum_n P(A \mid C, B_n) P(B_n)</math>
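Both discrete forms can be checked numerically. The following is a minimal sketch; the partition and joint probabilities are assumed purely for illustration, not drawn from a source:

<syntaxhighlight lang="python">
# Minimal sketch of the discrete law of total probability.
# The numbers below are illustrative assumptions, not from a source.
p_B = [0.2, 0.5, 0.3]      # P(B_n): a partition, so the weights sum to 1
p_AB = [0.18, 0.20, 0.21]  # P(A ∩ B_n), each at most the matching P(B_n)

# First form: P(A) = Σ_n P(A ∩ B_n)
p_A_joint = sum(p_AB)

# Second form: P(A) = Σ_n P(A | B_n) P(B_n), a weighted average
p_A_given_B = [pab / pb for pab, pb in zip(p_AB, p_B)]  # P(A | B_n)
p_A_avg = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))

print(p_A_joint, p_A_avg)  # both ≈ 0.59, between min and max of P(A | B_n)
</syntaxhighlight>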
==Continuous case==
The law of total probability extends to the case of conditioning on events generated by continuous random variables. Let <math>(\Omega, \mathcal{F}, P)</math> be a [[probability space]]. Suppose <math>X</math> is a random variable with distribution function <math>F_X</math>, and <math>A</math> an event on <math>(\Omega, \mathcal{F}, P)</math>. Then the law of total probability states

:<math>P(A) = \int_{-\infty}^\infty P(A \mid X = x) \, dF_X(x).</math>

If <math>X</math> admits a density function <math>f_X</math>, then the result is

:<math>P(A) = \int_{-\infty}^\infty P(A \mid X = x) f_X(x) \, dx.</math>

Moreover, in the specific case where <math>A = \{Y \in B\}</math> for a Borel set <math>B</math>, this yields

:<math>P(Y \in B) = \int_{-\infty}^\infty P(Y \in B \mid X = x) f_X(x) \, dx.</math>
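In the density form, the integral can be evaluated numerically. The following is a minimal sketch assuming a standard normal <math>X</math> and taking <math>P(A \mid X = x) = \Phi(x)</math> (the standard normal distribution function) purely as an illustrative choice of conditional probability:

<syntaxhighlight lang="python">
# Sketch of P(A) = ∫ P(A | X = x) f_X(x) dx for X ~ N(0, 1).
# The choice P(A | X = x) = Φ(x) is an illustrative assumption.
import math
from scipy.integrate import quad
from scipy.stats import norm

def p_A_given_x(x):
    return norm.cdf(x)  # assumed conditional probability, values in [0, 1]

p_A, _ = quad(lambda x: p_A_given_x(x) * norm.pdf(x), -math.inf, math.inf)
print(p_A)  # 0.5: the integrand is d/dx [Φ(x)² / 2], so the integral is 1/2
</syntaxhighlight>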
==Example==
Suppose that two factories supply [[light bulb]]s to the market. Factory ''X''<nowiki>'</nowiki>s bulbs work for over 5000 hours in 99% of cases, whereas factory ''Y''<nowiki>'</nowiki>s bulbs work for over 5000 hours in 95% of cases. It is known that factory ''X'' supplies 60% of the total bulbs available and factory ''Y'' supplies the remaining 40%. What is the chance that a purchased bulb will work for longer than 5000 hours?

Applying the law of total probability, we have:

:<math>
\begin{align}
P(A) & = P(A\mid B_X) \cdot P(B_X) + P(A\mid B_Y) \cdot P(B_Y) \\[4pt]
& = {99 \over 100} \cdot {6 \over 10} + {95 \over 100} \cdot {4 \over 10} = {{594 + 380} \over 1000} = {974 \over 1000}
\end{align}
</math>

where
* <math>P(B_X)={6 \over 10}</math> is the probability that the purchased bulb was manufactured by factory ''X'';
* <math>P(B_Y)={4 \over 10}</math> is the probability that the purchased bulb was manufactured by factory ''Y'';
* <math>P(A\mid B_X)={99 \over 100}</math> is the probability that a bulb manufactured by ''X'' will work for over 5000 hours;
* <math>P(A\mid B_Y)={95 \over 100}</math> is the probability that a bulb manufactured by ''Y'' will work for over 5000 hours.

Thus each purchased light bulb has a 97.4% chance of working for more than 5000 hours.
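The same computation as a short numerical check (the dictionary keys are just labels for the two factories):

<syntaxhighlight lang="python">
# Numerical check of the light-bulb example.
p_B = {"X": 0.60, "Y": 0.40}          # market shares P(B_X), P(B_Y)
p_A_given_B = {"X": 0.99, "Y": 0.95}  # P(A | B_X), P(A | B_Y)

p_A = sum(p_A_given_B[f] * p_B[f] for f in p_B)
print(p_A)  # 0.594 + 0.380 = 0.974
</syntaxhighlight>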
==Other names==
The term '''''law of total probability''''' is sometimes taken to mean the '''law of alternatives''', which is a special case of the law of total probability applying to [[discrete random variable]]s.{{Citation needed|date=September 2010}} One author uses the terminology of the "Rule of Average Conditional Probabilities",<ref name="Pitman1993">{{cite book|author=Jim Pitman|title=Probability|url=https://books.google.com/books?id=AoDkBwAAQBAJ&q=pitman%20probability&pg=PA41|year=1993|publisher=Springer|isbn=0-387-97974-3|page=41}}</ref> while another refers to it as the "continuous law of alternatives" in the continuous case.<ref name="Baclawski2008">{{cite book|author=Kenneth Baclawski|title=Introduction to probability with R|url=https://books.google.com/books?id=Kglc9g5IPf4C&pg=PA179|year=2008|publisher=CRC Press|isbn=978-1-4200-6521-3|page=179}}</ref> This result is given by Grimmett and Welsh<ref>''Probability: An Introduction'', by [[Geoffrey Grimmett]] and [[Dominic Welsh]], Oxford Science Publications, 1986, Theorem 1B.</ref> as the '''partition theorem''', a name that they also give to the related [[law of total expectation]].

==See also==
* [[Law of large numbers]]
* [[Law of total expectation]]
* [[Law of total variance]]
* [[Law of total covariance]]
* [[Law of total cumulance]]
* [[Marginal distribution]]

==Notes==
<references/>

==References==
* ''Introduction to Probability and Statistics'', by Robert J. Beaver and Barbara M. Beaver, Thomson Brooks/Cole, 2005, page 159.
* ''Theory of Statistics'', by Mark J. Schervish, Springer, 1995.
* ''Schaum's Outline of Probability, Second Edition'', by John J. Schiller and Seymour Lipschutz, McGraw–Hill Professional, 2010, page 89.
* ''A First Course in Stochastic Models'', by H. C. Tijms, John Wiley and Sons, 2003, pages 431–432.
* ''An Intermediate Course in Probability'', by Alan Gut, Springer, 1995, pages 5–6.

{{DEFAULTSORT:Law Of Total Probability}}
[[Category:Theorems in probability theory]]
[[Category:Statistical laws]]