=== Restricted parameter space ===
{{Distinguish|restricted maximum likelihood}}

While the domain of the likelihood function—the [[parameter space]]—is generally a finite-dimensional subset of [[Euclidean space]], additional [[Restriction (mathematics)|restriction]]s sometimes need to be incorporated into the estimation process. The parameter space can be expressed as
<math display="block">\Theta = \left\{ \theta : \theta \in \mathbb{R}^{k},\; h(\theta) = 0 \right\} ~,</math>
where <math>\; h(\theta) = \left[ h_{1}(\theta), h_{2}(\theta), \ldots, h_{r}(\theta) \right] \;</math> is a [[vector-valued function]] mapping <math>\, \mathbb{R}^{k} \,</math> into <math>\; \mathbb{R}^{r} ~.</math> Estimating the true parameter <math>\theta</math> belonging to <math>\Theta</math> then, as a practical matter, means finding the maximum of the likelihood function subject to the [[Constraint (mathematics)|constraint]] <math>~h(\theta) = 0 ~.</math>

Theoretically, the most natural approach to this [[constrained optimization]] problem is the method of substitution, that is, "filling out" the restrictions <math>\; h_{1}, h_{2}, \ldots, h_{r} \;</math> to a set <math>\; h_{1}, h_{2}, \ldots, h_{r}, h_{r+1}, \ldots, h_{k} \;</math> in such a way that <math>\; h^{\ast} = \left[ h_{1}, h_{2}, \ldots, h_{k} \right] \;</math> is a [[one-to-one function]] from <math>\mathbb{R}^{k}</math> to itself, and reparameterizing the likelihood function by setting <math>\; \phi_{i} = h_{i}(\theta_{1}, \theta_{2}, \ldots, \theta_{k}) ~.</math><ref name="Silvey p79">{{cite book |first=S. D. |last=Silvey |year=1975 |title=Statistical Inference |location=London, UK |publisher=Chapman and Hall |isbn=0-412-13820-4 |page=79 |url=https://books.google.com/books?id=qIKLejbVMf4C&pg=PA79 }}</ref> Because of the equivariance of the maximum likelihood estimator, the properties of the MLE apply to the restricted estimates also.<ref>{{cite web |first=David |last=Olive |year=2004 |title=Does the MLE maximize the likelihood? |website=Southern Illinois University |url=http://lagrange.math.siu.edu/Olive/simle.pdf }}</ref> For instance, in a [[multivariate normal distribution]] the [[covariance matrix]] <math>\, \Sigma \,</math> must be [[Positive-definite matrix|positive-definite]]; this restriction can be imposed by substituting <math>\; \Sigma = \Gamma^{\mathsf{T}} \Gamma \;,</math> where <math>\Gamma</math> is a real [[upper triangular matrix]] and <math>\Gamma^{\mathsf{T}}</math> is its [[transpose]].<ref>{{cite journal |first=Daniel P. |last=Schwallie |year=1985 |title=Positive definite maximum likelihood covariance estimators |journal=Economics Letters |volume=17 |issue=1–2 |pages=115–117 |doi=10.1016/0165-1765(85)90139-9 }}</ref>

In practice, restrictions are usually imposed using the method of Lagrange, which, given the constraints as defined above, leads to the ''restricted likelihood equations''
<math display="block">\frac{\partial \ell}{\partial \theta} - \frac{\partial h(\theta)^\mathsf{T}}{\partial \theta} \lambda = 0</math> and <math>h(\theta) = 0 \;,</math>
where <math>~ \lambda = \left[ \lambda_{1}, \lambda_{2}, \ldots, \lambda_{r}\right]^\mathsf{T} ~</math> is a column vector of [[Lagrange multiplier]]s and <math>\; \frac{\partial h(\theta)^\mathsf{T}}{\partial \theta} \;</math> is the {{mvar|k × r}} [[Jacobian matrix]] of partial derivatives.<ref name="Silvey p79"/> Naturally, if the constraints are not binding at the maximum, the Lagrange multipliers should be zero.<ref>{{cite book |first=Jan R. |last=Magnus |year=2017 |title=Introduction to the Theory of Econometrics |location=Amsterdam |publisher=VU University Press |pages=64–65 |isbn=978-90-8659-766-6}}</ref> This in turn allows for a statistical test of the "validity" of the constraint, known as the [[Lagrange multiplier test]].
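The substitution approach above can be sketched numerically. The following is a minimal illustration (not part of the cited sources, and the data and function names are invented for the example): the positive-definiteness restriction on a multivariate normal covariance <math>\Sigma</math> is eliminated by optimizing over the free entries of an upper triangular <math>\Gamma</math> and setting <math>\Sigma = \Gamma^{\mathsf{T}} \Gamma</math>, so an unconstrained optimizer can be used.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example data: draws from a zero-mean bivariate normal
rng = np.random.default_rng(0)
true_cov = np.array([[2.0, 0.6],
                     [0.6, 1.0]])
X = rng.multivariate_normal(mean=np.zeros(2), cov=true_cov, size=500)

n, k = X.shape
iu = np.triu_indices(k)  # positions of the free (upper-triangular) entries of Gamma

def neg_log_lik(params):
    """Negative log-likelihood of N(0, Sigma) with Sigma = Gamma^T Gamma."""
    Gamma = np.zeros((k, k))
    Gamma[iu] = params            # unconstrained parameters -> upper triangular Gamma
    Sigma = Gamma.T @ Gamma       # positive semi-definite by construction
    sign, logdet = np.linalg.slogdet(Sigma)
    if sign <= 0:                 # guard against a singular Gamma during the search
        return np.inf
    # sum_i x_i^T Sigma^{-1} x_i
    quad = np.einsum('ij,jk,ik->', X, np.linalg.inv(Sigma), X)
    return 0.5 * (n * logdet + quad)

# Start from Gamma = I (i.e., Sigma = I) and optimize without any constraint
res = minimize(neg_log_lik, x0=np.array([1.0, 0.0, 1.0]), method='Nelder-Mead')

Gamma_hat = np.zeros((k, k))
Gamma_hat[iu] = res.x
Sigma_hat = Gamma_hat.T @ Gamma_hat   # restricted MLE: positive semi-definite
```

By equivariance, `Sigma_hat` coincides with the unrestricted MLE of the covariance (here, with the mean fixed at zero, the sample second-moment matrix) whenever that estimate is itself positive-definite; the reparameterization merely guarantees the restriction holds at every step of the search.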