===Differences between linear and nonlinear least squares===
*The model function, ''f'', in LLSQ (linear least squares) is a linear combination of parameters of the form <math>f = X_{i1}\beta_1 + X_{i2}\beta_2 + \cdots</math> The model may represent a straight line, a parabola or any other linear combination of functions. In NLLSQ (nonlinear least squares) the parameters appear as functions, such as <math>\beta^2, e^{\beta x}</math> and so forth. If the derivatives <math>\partial f / \partial \beta_j</math> are either constant or depend only on the values of the independent variable, the model is linear in the parameters. Otherwise, the model is nonlinear.
*Solving an NLLSQ problem requires initial values for the parameters; LLSQ does not.
*Solution algorithms for NLLSQ often require that the Jacobian can be calculated, as in LLSQ. Analytical expressions for the partial derivatives can be complicated; if they are impossible to obtain, the partial derivatives must be calculated by numerical approximation, or an estimate of the Jacobian must be made, often via [[finite differences]].
*Non-convergence (failure of the algorithm to find a minimum) is a common phenomenon in NLLSQ.
*The sum of squares in LLSQ is globally convex, so non-convergence is not an issue.
*Solving NLLSQ is usually an iterative process that has to be terminated when a convergence criterion is satisfied. LLSQ solutions can be computed using direct methods, although problems with large numbers of parameters are typically solved with iterative methods, such as the [[Gauss–Seidel]] method.
*In LLSQ the solution is unique, but in NLLSQ there may be multiple minima of the sum of squares.
*Provided the errors are uncorrelated with the predictor variables, LLSQ yields unbiased estimates, but even under that condition NLLSQ estimates are generally biased.
These differences must be considered whenever the solution to a nonlinear least squares problem is being sought.<ref name=":1" />
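The contrast between a direct LLSQ solution and an iterative NLLSQ solution can be illustrated with a minimal pure-Python sketch (the function names <code>linear_fit</code> and <code>exp_fit</code> and the sample data are invented for illustration, not taken from the text): the linear model <math>y = \beta_1 + \beta_2 x</math> is solved in one step via the normal equations, while the nonlinear model <math>y = e^{\beta x}</math> needs an initial value and Gauss–Newton iteration.

```python
import math

def linear_fit(xs, ys):
    """LLSQ: fit y = b1 + b2*x by solving the 2x2 normal equations directly."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    b1 = (sxx * sy - sx * sxy) / det
    b2 = (n * sxy - sx * sy) / det
    return b1, b2

def exp_fit(xs, ys, beta0, iters=50):
    """NLLSQ: fit y = exp(beta*x) by Gauss-Newton, starting from beta0."""
    beta = beta0
    for _ in range(iters):
        # residuals r_i = y_i - f(x_i; beta) and Jacobian J_i = df/dbeta
        r = [y - math.exp(beta * x) for x, y in zip(xs, ys)]
        J = [x * math.exp(beta * x) for x in xs]
        # one Gauss-Newton step for a single parameter: beta += (J^T r)/(J^T J)
        beta += sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
    return beta

xs = [0.0, 1.0, 2.0, 3.0]
ys_lin = [1.0, 3.0, 5.0, 7.0]             # exactly y = 1 + 2x
ys_exp = [math.exp(0.5 * x) for x in xs]  # exactly y = exp(0.5 x)

print(linear_fit(xs, ys_lin))    # direct: solved in one shot
print(exp_fit(xs, ys_exp, 1.0))  # iterative: needs an initial value
```

With a poor initial value the Gauss–Newton loop in <code>exp_fit</code> can diverge, which is the non-convergence phenomenon noted above; <code>linear_fit</code> has no such failure mode.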