{{Short description|Proposition in statistics}}
{{Bayesian statistics}}
In [[statistics]], the '''likelihood principle''' is the proposition that, given a [[statistical model]], all the evidence in a [[Sampling (statistics)|sample]] relevant to model parameters is contained in the [[likelihood function]].

A likelihood function arises from a [[probability density function]] considered as a function of its distributional parameterization argument. For example, consider a model which gives the probability density function <math>\; f_X(x \mid \theta)\;</math> of observable [[random variable]] <math>\, X \,</math> as a function of a parameter <math>\,\theta~</math>. Then for a specific value <math>\,x\,</math> of <math>\,X~</math>, the function <math>\,\mathcal{L}(\theta \mid x) = f_X(x \mid \theta)\;</math> is a likelihood function of <math>\,\theta~</math>: it gives a measure of how "likely" any particular value of <math>\,\theta\,</math> is, if we know that <math>\,X\,</math> has the value <math>\,x~</math>. The density function may be a density with respect to counting measure, i.e. a [[probability mass function]].

Two likelihood functions are ''equivalent'' if one is a scalar multiple of the other.{{efn| Geometrically, if they occupy the same point in [[projective space]]. }} The '''likelihood principle''' is this: All information from the data that is relevant to inferences about the value of the model parameters is in the equivalence class to which the likelihood function belongs. The '''strong likelihood principle''' applies this same criterion to cases such as sequential experiments where the sample of data that is available results from applying a [[stopping rule]] to the observations earlier in the experiment.<ref>{{cite book |last=Dodge |first=Y. |year=2003 |title=The Oxford Dictionary of Statistical Terms |publisher=Oxford University Press |isbn=0-19-920613-9 }}</ref>
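The definitions above can be illustrated with a minimal numerical sketch, not drawn from the article itself: it assumes hypothetical binomial data (<math>x = 7</math> successes in <math>n = 10</math> trials), treats the probability mass function as a likelihood function of <math>\theta</math>, and checks that a scalar multiple of the likelihood leaves likelihood ratios and the maximising value of <math>\theta</math> unchanged, which is what the equivalence-class statement asserts.

<syntaxhighlight lang="python">
from math import comb

def binomial_likelihood(theta, x=7, n=10):
    """L(theta | x) = f_X(x | theta): binomial pmf viewed as a function of theta,
    with the data (x successes in n trials) held fixed. Hypothetical example data."""
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

# Evaluate the likelihood on a grid of candidate parameter values.
grid = [i / 100 for i in range(1, 100)]
L = [binomial_likelihood(t) for t in grid]

# A scalar multiple of the likelihood carries the same evidence:
# likelihood ratios between any two parameter values are unchanged.
c = 2.5
ratio = binomial_likelihood(0.5) / binomial_likelihood(0.7)
scaled_ratio = (c * binomial_likelihood(0.5)) / (c * binomial_likelihood(0.7))
assert abs(ratio - scaled_ratio) < 1e-12

# Both versions are maximised at the same theta (here x/n = 0.7).
L_scaled = [c * v for v in L]
print(max(zip(L, grid))[1], max(zip(L_scaled, grid))[1])
</syntaxhighlight>

Because only ratios of likelihood values matter for comparing parameter values, any function proportional to <math>\mathcal{L}(\theta \mid x)</math> belongs to the same equivalence class and, by the likelihood principle, conveys the same evidence about <math>\theta</math>.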