{{Short description|Mathematical algorithm}} [[File:Regression pic assymetrique.gif|thumb|300px|Fitting of a noisy curve by an asymmetrical peak model <math>f_{\beta}(x)</math> with parameters <math>\beta</math> by minimizing the sum of squared residuals <math> r_i(\beta) = y_i - f_{\beta}(x_i) </math> at grid points <math> x_i </math>, using the Gauss–Newton algorithm. <br /> Top: Raw data and model.<br /> Bottom: Evolution of the normalised sum of the squares of the errors.]] The '''Gauss–Newton algorithm''' is used to solve [[non-linear least squares]] problems, which amount to minimizing a sum of squared function values. It is an extension of [[Newton's method in optimization|Newton's method]] for finding a [[maxima and minima|minimum]] of a non-linear [[function (mathematics)|function]]. Since a sum of squares must be nonnegative, the algorithm can be viewed as using Newton's method to iteratively approximate [[Zero of a function|zeroes]] of the components of the sum, and thus minimizing the sum. In this sense, the algorithm is also an effective method for [[#Solving_overdetermined_systems_of_equations|solving overdetermined systems of equations]]. It has the advantage that second derivatives, which can be challenging to compute, are not required.<ref>{{cite book |first1=Ron C. |last1=Mittelhammer |first2=Douglas J. |last2=Miller |first3=George G. |last3=Judge |title=Econometric Foundations |location=Cambridge |publisher=Cambridge University Press |year=2000 |isbn=0-521-62394-4 |pages=197–198 |url=https://books.google.com/books?id=fycmsfkK6RQC&pg=PA197 }}</ref> Non-linear least squares problems arise, for instance, in [[non-linear regression]], where parameters in a model are sought such that the model is in good agreement with available observations. The method is named after the mathematicians [[Carl Friedrich Gauss]] and [[Isaac Newton]], and first appeared in Gauss's 1809 work ''Theoria motus corporum coelestium in sectionibus conicis solem ambientium''.<ref name=optimizationEncyc>{{Cite book|authorlink1=Christodoulos Floudas| last1 = Floudas | first1 = Christodoulos A. | last2=Pardalos |first2=Panos M.|title = Encyclopedia of Optimization | publisher = Springer | year = 2008 | page = 1130 | isbn = 9780387747583}}</ref>
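To make the iteration concrete, the following is a minimal illustrative sketch in Python using only NumPy (the names <code>gauss_newton</code>, <code>model</code>, and <code>jacobian</code> are hypothetical, not taken from any library). Each step linearizes the model about the current parameter estimate and solves the resulting linear least-squares problem for the increment, so only first derivatives of the model are required:

<syntaxhighlight lang="python">
import numpy as np

def gauss_newton(f, jac, x, y, beta0, tol=1e-8, max_iter=50):
    # Illustrative sketch of a basic Gauss-Newton loop.
    # f(x, beta): model values f_beta(x_i) at the grid points x_i.
    # jac(x, beta): Jacobian of the model w.r.t. beta, shape (len(x), len(beta)).
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = y - f(x, beta)   # residuals r_i(beta) = y_i - f_beta(x_i)
        J = jac(x, beta)     # first derivatives only; no Hessian needed
        # Gauss-Newton step: minimize ||r - J @ delta|| over delta,
        # the solution of the normal equations (J^T J) delta = J^T r,
        # computed here with a stable SVD-based least-squares solver.
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        beta += delta
        if np.linalg.norm(delta) < tol:
            break
    return beta

# Example: fit an exponential decay y = b0 * exp(b1 * x) to noisy data.
model = lambda x, b: b[0] * np.exp(b[1] * x)
jacobian = lambda x, b: np.column_stack(
    [np.exp(b[1] * x), b[0] * x * np.exp(b[1] * x)])

x = np.linspace(0.0, 1.0, 20)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(-1.5 * x) + 0.01 * rng.standard_normal(x.size)
print(gauss_newton(model, jacobian, x, y, beta0=[1.0, -1.0]))
# converges to approximately [2.0, -1.5]
</syntaxhighlight>

Note that the sketch solves each linearized subproblem with an orthogonal (SVD-based) solver rather than forming <math>\mathbf{J}^\mathsf{T}\mathbf{J}</math> explicitly, which is numerically preferable when the Jacobian is ill-conditioned.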