==Example==
[[File:Gauss Newton illustration.png|thumb|right|280px|Calculated curve obtained with <math>\hat\beta_1 = 0.362</math> and <math>\hat\beta_2 = 0.556</math> (in blue) versus the observed data (in red)]]
In this example, the Gauss–Newton algorithm will be used to fit a model to some data by minimizing the sum of squares of errors between the data and the model's predictions.

In a biology experiment studying the relation between substrate concentration {{math|[''S'']}} and reaction rate in an enzyme-mediated reaction, the data in the following table were obtained.

{| class="wikitable" style="text-align: center; margin-left: 1em;"
! {{mvar|i}}
| 1 || 2 || 3 || 4 || 5 || 6 || 7
|-
! {{math|[''S'']}}
| 0.038 || 0.194 || 0.425 || 0.626 || 1.253 || 2.500 || 3.740
|-
! Rate
| 0.050 || 0.127 || 0.094 || 0.2122 || 0.2729 || 0.2665 || 0.3317
|}

It is desired to find a curve (model function) of the form
<math display="block">\text{rate} = \frac{V_\text{max} \cdot [S]}{K_M + [S]}</math>
that best fits the data in the least-squares sense, with the parameters <math>V_\text{max}</math> and <math>K_M</math> to be determined.

Denote by <math>x_i</math> and <math>y_i</math> the values of {{math|[''S'']}} and '''rate''' respectively, with <math>i = 1, \dots, 7</math>. Let <math>\beta_1 = V_\text{max}</math> and <math>\beta_2 = K_M</math>. We will find <math>\beta_1</math> and <math>\beta_2</math> such that the sum of squares of the residuals
<math display="block">r_i = y_i - \frac{\beta_1 x_i}{\beta_2 + x_i}, \quad (i = 1, \dots, 7)</math>
is minimized.
The Jacobian <math>\mathbf{J_r}</math> of the vector of residuals <math>r_i</math> with respect to the unknowns <math>\beta_j</math> is a <math>7 \times 2</math> matrix with the <math>i</math>-th row having the entries
<math display="block">\frac{\partial r_i}{\partial \beta_1} = -\frac{x_i}{\beta_2 + x_i}; \quad \frac{\partial r_i}{\partial \beta_2} = \frac{\beta_1 \cdot x_i}{\left(\beta_2 + x_i\right)^2}.</math>

Starting with the initial estimates of <math>\beta_1 = 0.9</math> and <math>\beta_2 = 0.2</math>, after five iterations of the Gauss–Newton algorithm, the optimal values <math>\hat\beta_1 = 0.362</math> and <math>\hat\beta_2 = 0.556</math> are obtained. The sum of squares of residuals decreased from the initial value of 1.445 to 0.00784 after the fifth iteration. The plot in the figure on the right shows the curve determined by the model for the optimal parameters together with the observed data.
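The iteration described above can be sketched in a few lines of NumPy (this sketch is not part of the original article; it simply applies the update <math>\boldsymbol\beta^{(s+1)} = \boldsymbol\beta^{(s)} - (\mathbf{J_r}^\mathsf{T}\mathbf{J_r})^{-1}\mathbf{J_r}^\mathsf{T}\mathbf{r}</math> with the residuals and Jacobian entries given above, via the normal equations rather than an explicit inverse):

```python
import numpy as np

# Data from the table: substrate concentration [S] and reaction rate.
x = np.array([0.038, 0.194, 0.425, 0.626, 1.253, 2.500, 3.740])
y = np.array([0.050, 0.127, 0.094, 0.2122, 0.2729, 0.2665, 0.3317])

def residuals(beta):
    """r_i = y_i - beta1 * x_i / (beta2 + x_i)."""
    b1, b2 = beta
    return y - b1 * x / (b2 + x)

def jacobian(beta):
    """7x2 Jacobian of the residuals w.r.t. (beta1, beta2)."""
    b1, b2 = beta
    return np.column_stack([-x / (b2 + x),
                            b1 * x / (b2 + x) ** 2])

beta = np.array([0.9, 0.2])  # initial estimates from the text
for _ in range(5):           # five Gauss-Newton iterations
    r = residuals(beta)
    J = jacobian(beta)
    # Solve the normal equations (J^T J) delta = -J^T r for the step.
    delta = np.linalg.solve(J.T @ J, -J.T @ r)
    beta = beta + delta

print(beta)  # approximately [0.362, 0.556]
```

The final sum of squared residuals, `residuals(beta) @ residuals(beta)`, comes out near the 0.00784 quoted in the text.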