{{Short description|Function related to statistics and probability theory}} {{Bayesian statistics}} A '''likelihood function''' (often simply called the '''likelihood''') measures how well a [[statistical model]] explains [[Realization (probability)|observed data]] by calculating the probability of seeing that data under different [[Statistical parameter|parameter]] values of the model. It is constructed from the [[joint probability distribution]] of the [[random variable]] that (presumably) generated the observations.<ref>{{cite book |first1=George |last1=Casella |first2=Roger L. |last2=Berger |title=Statistical Inference |location= |publisher=Duxbury |edition=2nd |year=2002 |isbn=0-534-24312-6 |page=290 }}</ref><ref>{{cite book |first=Jon |last=Wakefield |title=Frequentist and Bayesian Regression Methods |location= |publisher=Springer |edition=1st |year=2013 |isbn=978-1-4419-0925-1 |page=36 }}</ref><ref>{{cite book |first1 = Erich L. |last1=Lehmann | first2 = George |last2 = Casella |title=Theory of Point Estimation |location= |publisher=Springer |edition=2nd |year=1998 |isbn= 0-387-98502-6 |page=444 }}</ref> When evaluated on the actual data points, it becomes a function solely of the model parameters. In [[maximum likelihood estimation]], the [[arg max|argument that maximizes]] the likelihood function serves as a [[Point estimation|point estimate]] for the unknown parameter, while the [[Fisher information]] (often approximated by the likelihood's [[Hessian matrix]] at the maximum) gives an indication of the estimate's [[Precision (statistics)|precision]]. 
In contrast, in [[Bayesian statistics]], the estimate of interest is the ''converse'' of the likelihood, the so-called [[posterior probability]] of the parameter given the observed data, which is calculated via [[Bayes' theorem|Bayes' rule]].<ref>{{cite book |first=Arnold |last=Zellner |title=An Introduction to Bayesian Inference in Econometrics |location=New York |publisher=Wiley |year=1971 |pages=13–14 |isbn=0-471-98165-6 }}</ref>
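The construction described above — forming the joint probability of the observations as a function of the parameter, then taking the argmax — can be sketched for a simple Bernoulli model. This is an illustrative example, not part of the article; the data and grid are hypothetical.

```python
import numpy as np

# Hypothetical data: 8 coin flips with 6 heads.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])

def likelihood(p, x):
    """Joint probability of the observations x under Bernoulli parameter p.

    With the data fixed, this is a function of p alone: the likelihood.
    """
    return np.prod(p ** x * (1 - p) ** (1 - x))

# Evaluate the likelihood over a grid of candidate parameter values.
grid = np.linspace(0.01, 0.99, 99)
L = np.array([likelihood(p, data) for p in grid])

# The maximum likelihood estimate is the argmax of the likelihood;
# for the Bernoulli model it coincides with the sample mean.
p_hat = grid[np.argmax(L)]
print(p_hat)        # ≈ 6/8 = 0.75
print(data.mean())  # 0.75
```

The grid search stands in for an analytic maximization; for the Bernoulli likelihood the argmax can also be found in closed form by setting the derivative of the log-likelihood to zero.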