=== Second-derivative test ===
{{Main|Second partial derivative test}}

The Hessian matrix of a [[convex function]] is [[Positive semi-definite matrix|positive semi-definite]]. Refining this property allows us to test whether a [[Critical point (mathematics)|critical point]] <math>x</math> is a local maximum, a local minimum, or a saddle point, as follows:

* If the Hessian is [[Positive-definite matrix|positive-definite]] at <math>x,</math> then <math>f</math> attains an isolated local minimum at <math>x.</math>
* If the Hessian is [[Positive-definite matrix#Negative-definite, semidefinite and indefinite matrices|negative-definite]] at <math>x,</math> then <math>f</math> attains an isolated local maximum at <math>x.</math>
* If the Hessian has both positive and negative [[eigenvalue]]s, then <math>x</math> is a [[saddle point]] for <math>f.</math>
* Otherwise the test is inconclusive.

This implies that at a local minimum the Hessian is positive-semidefinite, and at a local maximum the Hessian is negative-semidefinite. For positive-semidefinite and negative-semidefinite Hessians the test is inconclusive (a critical point where the Hessian is semidefinite but not definite may be a local extremum or a saddle point). However, more can be said from the point of view of [[Morse theory]].

The [[second-derivative test]] for functions of one and two variables is simpler than the general case. In one variable, the Hessian contains exactly one second derivative; if it is positive, then <math>x</math> is a local minimum, and if it is negative, then <math>x</math> is a local maximum; if it is zero, the test is inconclusive. In two variables, the [[determinant]] can be used, because the determinant is the product of the eigenvalues. If it is positive, then the eigenvalues are either both positive or both negative. If it is negative, then the two eigenvalues have different signs, so <math>x</math> is a saddle point. If it is zero, then the second-derivative test is inconclusive.

Equivalently, the second-order conditions that are sufficient for a local minimum or maximum can be expressed in terms of the sequence of leading principal (upper-leftmost) [[Minor (linear algebra)|minors]] (determinants of sub-matrices) of the Hessian; these conditions are a special case of those given in the next section for bordered Hessians for constrained optimization, namely the case in which the number of constraints is zero. Specifically, the sufficient condition for a minimum is that all of these leading principal minors be positive, while the sufficient condition for a maximum is that the minors alternate in sign, with the <math>1 \times 1</math> minor being negative.
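As a numerical illustration of both the eigenvalue test and the leading-principal-minors test described above, the following Python sketch classifies a critical point from its Hessian using [[NumPy]]. The function names <code>classify_critical_point</code> and <code>leading_principal_minors</code> and the tolerance <code>tol</code> are illustrative choices, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

def classify_critical_point(H, tol=1e-10):
    """Classify a critical point of f from its (symmetric) Hessian H.

    Eigenvalue form of the second-derivative test:
    positive-definite -> local minimum, negative-definite -> local
    maximum, indefinite -> saddle point, otherwise inconclusive.
    """
    eigvals = np.linalg.eigvalsh(H)  # eigenvalues of a symmetric matrix
    if np.all(eigvals > tol):
        return "local minimum"
    if np.all(eigvals < -tol):
        return "local maximum"
    if np.any(eigvals > tol) and np.any(eigvals < -tol):
        return "saddle point"
    return "inconclusive"  # semidefinite but not definite

def leading_principal_minors(H):
    """Determinants of the upper-left k-by-k sub-matrices, k = 1..n.

    All positive is sufficient for a minimum; alternating in sign
    with the 1-by-1 minor negative is sufficient for a maximum.
    """
    n = H.shape[0]
    return [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]

# f(x, y) = x**2 - y**2 has a critical point at the origin, where
# the Hessian is diag(2, -2): one positive and one negative eigenvalue.
H = np.diag([2.0, -2.0])
print(classify_critical_point(H))   # saddle point
print(leading_principal_minors(H))  # [2.0, -4.0]
</syntaxhighlight>

The tolerance guards against eigenvalues that vanish only up to floating-point error; an exactly semidefinite Hessian returns "inconclusive", matching the case where the test says nothing. In the example, the minors are neither all positive nor alternating starting negative, consistent with the saddle point.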