=== Curve-fitting methods ===
Curve-fitting methods try to attain [[superlinear convergence]] by assuming that ''f'' has some analytic form, e.g. a polynomial of finite degree. At each iteration, there is a set of "working points" at which the value of ''f'' (and possibly also its derivative) is known. Based on these points, a polynomial that fits the known values can be computed, and its minimum found analytically. The minimum point becomes a new working point, and the process repeats at the next iteration:<ref name=":0" />{{Rp|location=sec.5}}
* [[Newton's method in optimization|Newton's method]] is a special case of a curve-fitting method, in which the curve is a degree-two polynomial, constructed using the first and second derivatives of ''f''. If the method is started close enough to a non-degenerate local minimum (i.e., one with a positive second derivative), then it has [[quadratic convergence]].
* [[Regula falsi]] is another method that fits the function to a degree-two polynomial, but it uses the first derivative at two points, rather than the first and second derivative at the same point. If the method is started close enough to a non-degenerate local minimum, then it has superlinear convergence of order <math>\varphi \approx 1.618</math>.
* ''Cubic fit'' fits the function to a degree-three polynomial, using both the function values and the derivatives at the last two points. If the method is started close enough to a non-degenerate local minimum, then it has [[quadratic convergence]].

Curve-fitting methods have superlinear convergence when started close enough to the local minimum, but might diverge otherwise. ''Safeguarded curve-fitting methods'' simultaneously execute a linear-convergence method in parallel to the curve-fitting method.
They check in each iteration whether the point found by the curve-fitting method is close enough to the interval maintained by the safeguard method; if it is not, then the safeguard method is used to compute the next iterate.<ref name=":0" />{{Rp|location=5.2.3.4}}