=== Bayesian inference ===
{{Main|Bayesian inference}}

Bayesian inference refers to [[statistical inference]] where uncertainty in inferences is quantified using probability.<ref>{{Cite journal |last=Lee |first=Se Yoon |title=Gibbs sampler and coordinate ascent variational inference: A set-theoretical review |journal=Communications in Statistics - Theory and Methods |year=2021 |volume=51 |issue=6 |pages=1549–1568 |doi=10.1080/03610926.2021.1921214 |arxiv=2008.01006 |s2cid=220935477}}</ref> In classical [[frequentist inference]], model [[parameter]]s and hypotheses are considered to be fixed, and probabilities are not assigned to them. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin. However, it would make sense to state that the proportion of heads [[Law of large numbers|approaches one-half]] as the number of coin flips increases.<ref name="wakefield2013">{{cite book |last1=Wakefield |first1=Jon |title=Bayesian and frequentist regression methods |date=2013 |publisher=Springer |location=New York, NY |isbn=978-1-4419-0924-4}}</ref>

[[Statistical models]] specify a set of statistical assumptions and processes that represent how the sample data are generated. Statistical models have a number of parameters that can be modified. For example, the flips of a coin can be represented as samples from a [[Bernoulli distribution]], which models two possible outcomes. The Bernoulli distribution has a single parameter equal to the probability of one outcome, which in most cases is the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. In most cases, models only approximate the true process and may not take into account certain factors influencing the data.<ref name="bda" />

In Bayesian inference, probabilities can be assigned to model parameters, which can then be treated as [[random variable]]s. Bayesian inference uses Bayes' theorem to update these probabilities as more evidence is obtained.<ref name="bda" /><ref name="congdon2014">{{cite book |last1=Congdon |first1=Peter |title=Applied Bayesian modelling |date=2014 |publisher=Wiley |isbn=978-1119951513 |edition=2nd}}</ref> Furthermore, Bayesian methods allow for placing priors on entire models and calculating their posterior probabilities using Bayes' theorem. These posterior probabilities are proportional to the product of the prior and the marginal likelihood, where the marginal likelihood is the integral of the sampling density over the prior distribution of the parameters. In complex models, marginal likelihoods are generally computed numerically.<ref name="chib1995">{{cite journal |last=Chib |first=Siddhartha |title=Marginal Likelihood from the Gibbs Output |journal=Journal of the American Statistical Association |year=1995 |volume=90 |issue=432 |pages=1313–1321 |doi=10.1080/01621459.1995.10476635}}</ref>
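The coin example above can be worked end to end. The sketch below (in Python) assumes a hypothetical record of 7 heads in 10 flips and a uniform Beta(1, 1) prior on the heads probability; because the Beta prior is conjugate to the Bernoulli likelihood, Bayes' theorem gives the posterior in closed form, and the marginal likelihood (the integral of the sampling density over the prior) can be computed both numerically and exactly as a check:

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate
from scipy.special import betaln, comb
from scipy.stats import beta, binom

# Hypothetical data (an assumption for illustration): 7 heads in 10 flips
# of a coin whose heads probability theta is unknown.
n, k = 10, 7

# Prior on theta: Beta(a, b); a = b = 1 is the uniform prior.
a, b = 1.0, 1.0

# Bayes' theorem with the conjugate Beta prior gives the posterior directly:
# theta | data ~ Beta(a + k, b + n - k).
posterior = beta(a + k, b + n - k)
print("posterior mean:", posterior.mean())  # (a + k) / (a + b + n) = 8/12

# Marginal likelihood: the integral of the sampling density (the binomial
# likelihood) over the prior distribution of theta, computed numerically ...
integrand = lambda theta: binom.pmf(k, n, theta) * beta.pdf(theta, a, b)
marginal_numeric, _ = integrate.quad(integrand, 0.0, 1.0)

# ... and in closed form for this conjugate model:
# p(data) = C(n, k) * B(a + k, b + n - k) / B(a, b).
marginal_exact = comb(n, k) * np.exp(betaln(a + k, b + n - k) - betaln(a, b))

print("marginal likelihood (numeric):", marginal_numeric)  # 1/11 = 0.0909...
print("marginal likelihood (exact):  ", marginal_exact)
</syntaxhighlight>

With a non-conjugate prior the closed form disappears and only the numerical route remains, which is why, as noted above, marginal likelihoods in complex models are generally computed numerically.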