====Bayesian interpretation====

In [[Bayesian inference]], one can speak about the likelihood of any proposition or [[random variable]] given another random variable: for example, the likelihood of a parameter value or of a [[statistical model]] (see [[marginal likelihood]]) given specified data or other evidence.<ref name='good1950'>I. J. Good: ''Probability and the Weighing of Evidence'' (Griffin 1950), §6.1</ref><ref name='jeffreys1983'>H. Jeffreys: ''Theory of Probability'' (3rd ed., Oxford University Press 1983), §1.22</ref><ref name='jaynes2003'>E. T. Jaynes: ''Probability Theory: The Logic of Science'' (Cambridge University Press 2003), §4.1</ref><ref name='lindley1980'>D. V. Lindley: ''Introduction to Probability and Statistics from a Bayesian Viewpoint. Part 1: Probability'' (Cambridge University Press 1980), §1.6</ref> Nevertheless, the likelihood function remains the same entity, with the additional interpretations of (i) a [[Conditional probability distribution|conditional density]] of the data given the parameter (since the parameter is then a random variable) and (ii) a measure or amount of information brought by the data about the parameter value or even the model.<ref name='good1950'/><ref name='jeffreys1983'/><ref name='jaynes2003'/><ref name='lindley1980'/><ref name='gelmanetal2014'>A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, D. B. Rubin: ''Bayesian Data Analysis'' (3rd ed., Chapman & Hall/CRC 2014), §1.3</ref>

Because a probability structure is introduced on the parameter space or on the collection of models, it is possible for a parameter value or a statistical model to have a large likelihood value for given data and yet a low ''probability'', or vice versa.<ref name='jaynes2003'/><ref name='gelmanetal2014'/> This is often the case in medical contexts.<ref>{{citation |first1=H. C. |last1=Sox |first2=M. C. |last2=Higgins |first3=D. K. |last3=Owens |title=Medical Decision Making |edition=2nd |publisher=Wiley |year=2013 |doi=10.1002/9781118341544 |isbn=9781118341544 |at=chapters 3–4 }}</ref> Following [[Bayes' Rule]], the likelihood, when seen as a conditional density, can be multiplied by the [[prior probability]] density of the parameter and then normalized to give a [[posterior probability]] density.<ref name='good1950'/><ref name='jeffreys1983'/><ref name='jaynes2003'/><ref name='lindley1980'/><ref name="gelmanetal2014"/> More generally, the likelihood of an unknown quantity <math display="inline">X</math> given another unknown quantity <math display="inline">Y</math> is proportional to the ''probability of <math display="inline">Y</math> given <math display="inline">X</math>''.<ref name='good1950'/><ref name='jeffreys1983'/><ref name='jaynes2003'/><ref name='lindley1980'/><ref name='gelmanetal2014'/>
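Written out with generic symbols (a parameter <math display="inline">\theta</math> and observed data <math display="inline">x</math>, chosen here only for illustration and not tied to a particular model), the multiply-and-normalize step above is

<math display="block">p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta')\, p(\theta')\, d\theta'},</math>

where <math display="inline">p(x \mid \theta)</math> is the likelihood viewed as a conditional density of the data, <math display="inline">p(\theta)</math> is the prior density, and the integral in the denominator is the normalizing constant, the [[marginal likelihood]] of the data.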