==Probabilistic causation==
{{main|Probabilistic causation}}

Interpreting [[causality|causation]] as a [[Causal determinism|deterministic]] relation means that if ''A'' causes ''B'', then ''A'' must always be followed by ''B''. In this sense, however, war does not always cause deaths (see [[Cyberwarfare]]), nor does a single instance of [[Tobacco smoking|smoking]] always cause [[cancer]]. As a result, many turn to a notion of [[probabilistic causation]]. Informally, ''A'' '''probabilistically''' causes ''B'' if ''A''<nowiki>'</nowiki>s occurrence increases the probability of ''B''. This is sometimes interpreted as reflecting imperfect knowledge of a deterministic system, and other times as meaning that the causal system under study is inherently indeterministic. ([[Propensity probability]] is an analogous idea, according to which probabilities have an objective existence and are not merely limitations on a subject's knowledge.)<ref>[http://plato.stanford.edu/entries/probability-interpret/ Stanford Encyclopedia of Philosophy: ''Interpretations of Probability'']</ref>

It can be proved that realizations of any [[probability distribution]] other than the [[uniform distribution (continuous)|uniform]] one are mathematically equivalent to applying a deterministic function (namely, an [[inverse distribution function]]) to a random variable following the uniform distribution, i.e. an "absolutely random" one;<ref>The uniform distribution is the most "agnostic" distribution, representing a lack of any information. [[Pierre-Simon Laplace|Laplace]], in his theory of probability, was apparently the first to notice this. Currently it can be shown using definitions of [[Entropy (information theory)|entropy]].</ref> the probabilities are contained entirely in the deterministic element. A simple way to demonstrate this is to shoot randomly within a square and then (deterministically) interpret a relatively large subsquare as the more probable outcome.
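The deterministic function in question is the one used in [[inverse transform sampling]]. For example, if <math>U</math> is uniformly distributed on <math>(0,1)</math> and <math>F(x) = 1 - e^{-\lambda x}</math> is the [[cumulative distribution function]] of the [[exponential distribution]] with rate <math>\lambda</math>, then the deterministic transformation

:<math>X = F^{-1}(U) = -\frac{\ln(1-U)}{\lambda}</math>

yields <math>\Pr(X \le x) = \Pr(U \le F(x)) = F(x)</math>, so <math>X</math> follows the exponential distribution even though the only randomness involved is the uniform variable <math>U</math>.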