{{short description|Approximation method in statistics}} {{distinguish-redirect|Least squares approximation|Least-squares function approximation}} {{Regression bar}} {{longlead|date=April 2025}} [[File:Linear least squares2.svg|right|thumb|The result of fitting a set of data points with a quadratic function]] [[File:X33-ellips-1.svg|thumb|Conic fitting a set of points using least-squares approximation]] In [[regression analysis]], '''least squares''' is a [[parameter estimation]] method in which the sum of the squares of the [[Residuals (statistics)|residuals]] (a residual being the difference between an observed value and the fitted value provided by a model) is minimized. The most important application is in [[curve fitting|data fitting]]. When the problem has substantial [[Uncertainty|uncertainties]] in the [[independent variable]] (the ''x'' variable), simple regression and least-squares methods have problems; in such cases, the methodology required for fitting [[errors-in-variables models]] may be considered instead of that for least squares.

Least squares problems fall into two categories: linear or [[ordinary least squares]] and [[nonlinear least squares]], depending on whether or not the model functions are linear in all unknowns. The linear least-squares problem occurs in statistical [[regression analysis]]; it has a [[closed-form solution]]. The nonlinear problem is usually solved by [[iterative refinement]]; at each iteration the system is approximated by a linear one, and thus the core calculation is similar in both cases. [[Polynomial least squares]] describes the [[variance]] in a prediction of the dependent variable as a function of the independent variable and the [[Deviation (statistics)|deviations]] from the fitted curve.

When the observations come from an [[exponential family]] with identity as its natural sufficient statistic and mild conditions are satisfied (e.g. for [[Normal distribution|normal]], [[Exponential distribution|exponential]], [[Poisson distribution|Poisson]] and [[Binomial distribution|binomial distributions]]), standardized least-squares estimates and [[Maximum likelihood|maximum-likelihood]] estimates are identical.<ref>{{Cite journal | last1 = Charnes | first1 = A. | last2 = Frome | first2 = E. L. | last3 = Yu | first3 = P. L. | doi = 10.1080/01621459.1976.10481508 | title = The Equivalence of Generalized Least Squares and Maximum Likelihood Estimates in the Exponential Family | journal = Journal of the American Statistical Association | volume = 71 | issue = 353 | pages = 169–171 | year = 1976 }}</ref> The method of least squares can also be derived as a [[method of moments (statistics)|method of moments]] estimator.

The following discussion is mostly presented in terms of [[linear]] functions, but the use of least squares is valid and practical for more general families of functions. Also, by iteratively applying local quadratic approximation to the likelihood (through the [[Fisher information]]), the least-squares method may be used to fit a [[generalized linear model]].

The least-squares method was officially discovered and published by [[Adrien-Marie Legendre]] (1805),<ref>Mansfield Merriman, "A List of Writings Relating to the Method of Least Squares"</ref> though it is usually also co-credited to [[Carl Friedrich Gauss]] (1809),<ref name=brertscher>{{cite book |last=Bretscher |first=Otto |title=Linear Algebra With Applications |edition=3rd |publisher=Prentice Hall |year=1995 |location=Upper Saddle River, NJ}}</ref><ref name=":5">{{cite journal |first=Stephen M. |last=Stigler |year=1981 |title=Gauss and the Invention of Least Squares |journal=Ann. Stat. |volume=9 |issue=3 |pages=465–474 |doi=10.1214/aos/1176345451 |doi-access=free }}</ref> who contributed significant theoretical advances to the method,<ref name=":5" /> and may have also used it in his earlier work in 1794 and 1795.<ref name=":3">{{Cite journal |last=Plackett |first=R.L. |author-link=R. L. Plackett |date=1972 |title=The discovery of the method of least squares |url=https://hedibert.org/wp-content/uploads/2016/08/plackett1972-thediscoveryofthemethodofleastsquares.pdf |journal=Biometrika |volume=59 |issue=2 |pages=239–251}}</ref><ref name=":5" />
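As a hedged illustration of the closed-form linear least-squares solution mentioned above, the sketch below fits a quadratic with NumPy. The model is quadratic in ''x'' but linear in the unknown coefficients, so ordinary least squares applies directly; the data values are invented for the example.

```python
import numpy as np

# Invented sample data: y follows 1 + 2x + 3x^2 exactly (no noise),
# so least squares should recover the coefficients.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x + 3.0 * x**2

# Design matrix with columns 1, x, x^2: the model is linear in the
# unknown coefficients even though it is quadratic in x.
X = np.column_stack([np.ones_like(x), x, x**2])

# Closed-form solution beta = (X^T X)^{-1} X^T y, computed stably
# via a least-squares solver rather than an explicit inverse.
beta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # recovers [1, 2, 3]
```

With noisy data the same call returns the coefficients minimizing the sum of squared residuals; nonlinear models, by contrast, would require the iterative linearization described above.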