Maximum likelihood estimation
{{Short description|Method of estimating the parameters of a statistical model, given observations}} {{About|the statistical techniques|computer data storage|partial-response maximum-likelihood}} In [[statistics]], '''maximum likelihood estimation''' ('''MLE''') is a method of [[estimation theory|estimating]] the [[Statistical parameter|parameters]] of an assumed [[probability distribution]], given some observed data. This is achieved by [[Mathematical optimization|maximizing]] a [[likelihood function]] so that, under the assumed [[statistical model]], the [[Realization (probability)|observed data]] is most probable. The [[point estimate|point]] in the [[parameter space]] that maximizes the likelihood function is called the maximum likelihood estimate.<ref>{{cite book |last=Rossi |first=Richard J. |title=Mathematical Statistics: An Introduction to Likelihood Based Inference |location=New York |publisher=John Wiley & Sons |year=2018 |isbn=978-1-118-77104-4 |page=227 }}</ref> The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of [[statistical inference]].<ref>{{cite book |first1=David F. |last1=Hendry |author-link=David Forbes Hendry |first2=Bent |last2=Nielsen |title=Econometric Modeling: A Likelihood Approach |location=Princeton |publisher=Princeton University Press |year=2007 |isbn=978-0-691-13128-3 }}</ref><ref>{{cite book |first1=Raymond L. |last1=Chambers |first2=David G. |last2=Steel |first3=Suojin |last3=Wang |first4=Alan |last4=Welsh |title=Maximum Likelihood Estimation for Sample Surveys |location=Boca Raton |publisher=CRC Press |year=2012 |isbn=978-1-58488-632-7 }}</ref><ref>{{cite book |first1=Michael Don |last1=Ward |author-link=Michael D. Ward |first2=John S. 
|last2=Ahlquist |title=Maximum Likelihood for Social Science: Strategies for Analysis |location=New York |publisher=Cambridge University Press |year=2018 |isbn=978-1-107-18582-1 }}</ref> If the likelihood function is [[Differentiable function|differentiable]], the [[derivative test]] for finding maxima can be applied. In some cases, the first-order conditions of the likelihood function can be solved analytically; for instance, the [[ordinary least squares]] estimator for a [[linear regression]] model maximizes the likelihood when the random errors are assumed to have [[Normal distribution|normal]] distributions with the same variance.<ref>{{cite book |last1=Press |first1=W.H. |last2=Flannery |first2=B.P. |last3=Teukolsky |first3=S.A. |last4=Vetterling |first4=W.T. |chapter=Least Squares as a Maximum Likelihood Estimator |title=Numerical Recipes in FORTRAN: The Art of Scientific Computing |edition=2nd |location=Cambridge |publisher=Cambridge University Press |pages=651–655 |year=1992 |isbn=0-521-43064-X |chapter-url=https://books.google.com/books?id=gn_4mpdN9WkC&pg=PA651 }}</ref> From the perspective of [[Bayesian inference]], MLE is generally equivalent to [[maximum a posteriori estimation|maximum a posteriori (MAP) estimation]] with a [[Prior probability|prior distribution]] that is [[uniform distribution (continuous)|uniform]] in the region of interest. In [[frequentist inference]], MLE is a special case of an [[extremum estimator]], with the objective function being the likelihood.
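As a concrete sketch of the idea (an illustration under assumed data, not part of the article), the normal model is a case where the first-order conditions can be solved analytically: the MLE of the mean is the sample mean, and the MLE of the variance is the mean squared deviation. The snippet below computes these closed-form estimates for simulated data and checks that they do yield a lower negative log-likelihood than nearby parameter values:

```python
import math
import random

def normal_neg_log_likelihood(data, mu, sigma):
    """Negative log-likelihood of i.i.d. Normal(mu, sigma) observations."""
    n = len(data)
    return (n * math.log(sigma * math.sqrt(2 * math.pi))
            + sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))

# Simulated data; the true parameters (mu=5, sigma=2) are an assumption
# for this illustration only.
random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(1000)]

# Closed-form MLE for the normal model: the sample mean, and the
# square root of the (biased) mean squared deviation.
n = len(data)
mu_hat = sum(data) / n
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / n)

# The analytic maximizer should beat any nearby perturbation of the
# parameters, i.e. give a strictly smaller negative log-likelihood.
best = normal_neg_log_likelihood(data, mu_hat, sigma_hat)
for dmu in (-0.1, 0.1):
    for dsigma in (-0.1, 0.1):
        perturbed = normal_neg_log_likelihood(data, mu_hat + dmu, sigma_hat + dsigma)
        assert best < perturbed
```

Maximizing the likelihood is equivalent to minimizing the negative log-likelihood, which is numerically more convenient because the log turns a product of densities into a sum.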