Bayesian inference
==History==
{{Main|History of statistics#Bayesian statistics}}
The term ''Bayesian'' refers to [[Thomas Bayes]] (1701–1761), who proved that probabilistic limits could be placed on an unknown event.{{Reference needed|date=July 2022}} However, it was [[Pierre-Simon Laplace]] (1749–1827) who introduced (as Principle VI) what is now called [[Bayes' theorem]] and used it to address problems in [[celestial mechanics]], medical statistics, [[Reliability (statistics)|reliability]], and [[jurisprudence]].<ref name="Stigler1986" /> Early Bayesian inference, which used uniform priors following Laplace's [[principle of insufficient reason]], was called "[[inverse probability]]" (because it [[Inductive reasoning|infer]]s backwards from observations to parameters, or from effects to causes<ref name=Fienberg2006/>). After the 1920s, "inverse probability" was largely supplanted by a collection of methods that came to be called [[frequentist statistics]].<ref name=Fienberg2006/>

In the 20th century, the ideas of Laplace were further developed in two different directions, giving rise to ''objective'' and ''subjective'' currents in Bayesian practice. In the objective or "non-informative" current, the statistical analysis depends on only the model assumed, the data analyzed,<ref name="Bernardo2005"/> and the method assigning the prior, which differs from one objective Bayesian practitioner to another. In the subjective or "informative" current, the specification of the prior depends on the belief (that is, propositions on which the analysis is prepared to act), which can summarize information from experts, previous studies, etc.
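In modern notation (which is not Laplace's own), the theorem he stated as Principle VI is commonly written as

<math display="block">P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},</math>

that is, the probability of a cause <math>A</math> given an observed effect <math>B</math> is obtained from the probability of the effect given the cause, reweighted by the prior probability of the cause. This backward step from effects to causes is what the name "inverse probability" referred to.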
In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, mostly attributed to the discovery of [[Markov chain Monte Carlo]] methods, which removed many of the computational problems, and an increasing interest in nonstandard, complex applications.<ref name="Wolpert2004"/> Despite growth of Bayesian research, most undergraduate teaching is still based on frequentist statistics.<ref name="Bernardo2006"/> Nonetheless, Bayesian methods are widely accepted and used, such as for example in the field of [[machine learning]].<ref name="Bishop2007" />
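The computational advance can be illustrated with a minimal random-walk [[Metropolis–Hastings algorithm|Metropolis]] sampler, the simplest member of the MCMC family. This sketch is for orientation only; the function name, step size, and target posterior are illustrative choices, not taken from the cited sources.

```python
import math
import random

def metropolis(log_post, x0, n_samples, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D log-posterior.

    log_post : function returning the log of the (unnormalized) posterior
    x0       : starting point of the chain
    step     : standard deviation of the Gaussian proposal
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, post(proposal) / post(x));
        # only a ratio is needed, so the normalizing constant cancels.
        if rng.random() < math.exp(min(0.0, log_post(proposal) - log_post(x))):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal posterior, log p(x) = -x^2/2 + const.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
```

Because the acceptance rule uses only a ratio of posterior densities, the sampler never needs the normalizing constant — the integral that made many Bayesian computations intractable before MCMC.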