Independence (probability theory)
{{Short description|When the occurrence of one event does not affect the likelihood of another}}
{{Probability fundamentals}}

'''Independence''' is a fundamental notion in [[probability theory]], as in [[statistics]] and the theory of [[stochastic processes]]. Two [[event (probability theory)|event]]s are '''independent''', '''statistically independent''', or '''stochastically independent'''<ref name="Artificial Intelligence">{{cite book | last1 = Russell| first1 =Stuart| last2 = Norvig | first2 = Peter | title = Artificial Intelligence: A Modern Approach | url = https://archive.org/details/artificialintell00russ_726| url-access = limited| page = [https://archive.org/details/artificialintell00russ_726/page/n506 478] | publisher = [[Prentice Hall]] | year = 2002 | isbn = 0-13-790395-2}}</ref> if, informally speaking, the occurrence of one does not affect the probability of occurrence of the other or, equivalently, does not affect the [[odds]]. Similarly, two [[random variable]]s are independent if the realization of one does not affect the [[probability distribution]] of the other.

When dealing with collections of more than two events, two notions of independence need to be distinguished. The events are called [[Pairwise independence|pairwise independent]] if any two events in the collection are independent of each other, while '''mutual independence''' (or '''collective independence''') of events means, informally speaking, that each event is independent of any combination of other events in the collection. A similar notion exists for collections of random variables. Mutual independence implies pairwise independence, but not the other way around. In the standard literature of probability theory, statistics, and stochastic processes, '''independence''' without further qualification usually refers to mutual independence.
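The gap between pairwise and mutual independence can be made concrete with a small sketch (an illustrative Bernstein-style construction, not part of the article): flip two fair coins and consider the events "first flip is heads", "second flip is heads", and "the two flips agree". Each pair satisfies P(X ∩ Y) = P(X)·P(Y), yet the three events together do not.

```python
from itertools import product

# Sample space: two independent fair coin flips, 4 equally likely outcomes.
outcomes = list(product("HT", repeat=2))

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return len(event) / len(outcomes)

A = {o for o in outcomes if o[0] == "H"}   # first flip is heads
B = {o for o in outcomes if o[1] == "H"}   # second flip is heads
C = {o for o in outcomes if o[0] == o[1]}  # the two flips agree

# Pairwise independence: P(X ∩ Y) = P(X) * P(Y) for every pair.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(X & Y) == prob(X) * prob(Y)

# Not mutually independent: P(A ∩ B ∩ C) = 1/4 but P(A) P(B) P(C) = 1/8.
print(prob(A & B & C))            # 0.25
print(prob(A) * prob(B) * prob(C))  # 0.125
```

Knowing any two of the three events already determines the third (e.g. if both flips are heads, the flips certainly agree), which is exactly why the triple product factorization fails even though every pair factorizes.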