== Multi-dimensional line search ==

In general, we have a multi-dimensional [[objective function]] <math>f:\mathbb R^n\to\mathbb R</math>. The line-search method first finds a [[descent direction]] along which the objective function <math>f</math> will be reduced, and then computes a step size that determines how far <math>\mathbf{x}</math> should move along that direction. The descent direction can be computed by various methods, such as [[gradient descent]] or a [[quasi-Newton method]]. The step size can be determined either exactly or inexactly.

Here is an example gradient method that uses a line search in step 2.3 (a sketch in code follows the list):

# Set the iteration counter <math>k=0</math>, make an initial guess <math>\mathbf{x}_0</math> for the minimum, and pick a tolerance <math>\epsilon</math>.
# Loop:
## Compute a [[descent direction]] <math>\mathbf{p}_k</math>.
## Define a one-dimensional function <math>h(\alpha_k)=f(\mathbf{x}_k+\alpha_k\mathbf{p}_k)</math>, representing the objective value along the descent direction as a function of the step size.
## Find an <math>\alpha_k</math> that minimizes <math>h</math> over <math>\alpha_k\in\mathbb R_+</math>.
## Update <math>\mathbf{x}_{k+1}=\mathbf{x}_k+\alpha_k\mathbf{p}_k</math> and <math>k=k+1</math>.
# Until <math>\|\nabla f(\mathbf{x}_{k+1})\|<\epsilon</math>.

At the line-search step (2.3), the algorithm may minimize ''h'' ''exactly'', by solving <math>h'(\alpha_k)=0</math>, or ''approximately'', by using one of the one-dimensional line-search methods mentioned above. It can also be solved ''loosely'', by asking only for a sufficient decrease in ''h'' that does not necessarily approximate the optimum. One example of the former is the [[conjugate gradient method]]. The latter is called an inexact line search and may be performed in a number of ways, such as a [[backtracking line search]] or using the [[Wolfe conditions]].
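The following is a minimal Python sketch of the gradient method above, using steepest descent for the direction and a backtracking (Armijo) line search for step 2.3; the function names, default parameters, and the example objective are illustrative choices, not part of the method's definition.

<syntaxhighlight lang="python">
import numpy as np

def backtracking_line_search(f, grad_k, x_k, p_k, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink the step size until the Armijo sufficient-decrease condition holds."""
    alpha = alpha0
    f_k = f(x_k)
    slope = c * grad_k @ p_k          # scaled directional derivative (negative for a descent direction)
    while f(x_k + alpha * p_k) > f_k + alpha * slope:
        alpha *= rho                  # reject the step and try a smaller one
    return alpha

def gradient_method_with_line_search(f, grad, x0, eps=1e-6, max_iter=1000):
    """Gradient method: direction p_k = -grad f(x_k), step size from a line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < eps:   # stopping rule ||grad f(x)|| < eps
            break
        p = -g                        # steepest-descent direction
        alpha = backtracking_line_search(f, g, x, p)
        x = x + alpha * p             # x_{k+1} = x_k + alpha_k p_k
    return x

# Example: minimize the quadratic f(x) = x1^2 + 10*x2^2, whose minimum is at the origin.
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(gradient_method_with_line_search(f, grad, [3.0, -2.0]))
</syntaxhighlight>

Replacing the backtracking routine with an exact minimizer of <math>h</math>, or with a search enforcing the [[Wolfe conditions]], changes only step 2.3; the outer loop is unchanged.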