===Calculus of optimization===
{{Main|Karush–Kuhn–Tucker conditions}}
{{See also|Critical point (mathematics)|Differential calculus|Gradient|Hessian matrix|Definite matrix|Lipschitz continuity|Rademacher's theorem|Convex function|Convex analysis}}

For unconstrained problems with twice-differentiable functions, some [[critical point (mathematics)|critical points]] can be found by finding the points where the [[gradient]] of the objective function is zero (that is, the stationary points). More generally, a zero [[subgradient]] certifies that a local minimum has been found for [[convex optimization|minimization problems with convex]] [[convex function|functions]] and other [[Rademacher's theorem|locally]] [[Lipschitz function]]s (see the example below); such problems arise in the minimization of neural-network loss functions. Positive-negative momentum estimation can help such minimization escape local minima and converge toward the global minimum of the objective function.<ref>{{Cite journal |last1=Abdulkadirov |first1=R. |last2=Lyakhov |first2=P. |last3=Bergerman |first3=M. |last4=Reznikov |first4=D. |date=February 2024 |title=Satellite image recognition using ensemble neural networks and difference gradient positive-negative momentum |url=https://linkinghub.elsevier.com/retrieve/pii/S0960077923013346 |journal=Chaos, Solitons & Fractals |language=en |volume=179 |pages=114432 |doi=10.1016/j.chaos.2023.114432 |bibcode=2024CSF...17914432A}}</ref>

Further, critical points can be classified using the [[positive definite matrix|definiteness]] of the [[Hessian matrix]]: if the Hessian is ''positive definite'' at a critical point, then the point is a local minimum; if the Hessian is ''negative definite'', then the point is a local maximum; finally, if the Hessian is indefinite, then the point is some kind of [[saddle point]] (a worked example is given below).

Constrained problems can often be transformed into unconstrained problems with the help of [[Lagrange multiplier]]s (see the worked example below). [[Lagrangian relaxation]] can also provide approximate solutions to difficult constrained problems.

When the objective function is a [[convex function]], any local minimum is also a global minimum. There exist efficient numerical techniques for minimizing convex functions, such as [[interior-point method]]s.
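For instance, the zero-subgradient condition can be verified directly for the [[absolute value]] function <math>f(x) = |x|</math>, which is convex but not differentiable at the origin. Its subdifferential there is
:<math>\partial f(0) = [-1,\, 1],</math>
and since <math>0 \in \partial f(0)</math>, this certifies that <math>x = 0</math> is a (global) minimum even though the ordinary gradient does not exist at that point.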
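As a worked example of the Hessian test, consider <math>f(x, y) = x^2 - y^2</math>. Its gradient <math>\nabla f = (2x,\, -2y)</math> vanishes only at the origin, where the Hessian
:<math>H = \begin{pmatrix} 2 & 0 \\ 0 & -2 \end{pmatrix}</math>
has one positive and one negative eigenvalue. The Hessian is therefore indefinite, and the origin is a [[saddle point]]; replacing the objective with <math>x^2 + y^2</math> makes the Hessian positive definite, and the same critical point becomes a (global) minimum.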
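As a worked example of the method of Lagrange multipliers, consider minimizing <math>f(x, y) = x^2 + y^2</math> subject to the constraint <math>x + y = 1</math>. The Lagrangian is
:<math>\mathcal{L}(x, y, \lambda) = x^2 + y^2 - \lambda\,(x + y - 1).</math>
Setting its partial derivatives to zero gives <math>2x = \lambda</math>, <math>2y = \lambda</math>, and <math>x + y = 1</math>, so <math>x = y = \tfrac{1}{2}</math> and <math>\lambda = 1</math>; the constrained minimum <math>f\left(\tfrac{1}{2}, \tfrac{1}{2}\right) = \tfrac{1}{2}</math> is thereby obtained as a stationary point of an unconstrained function.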