{{Short description|Regularization technique for ill-posed problems}} {{Regression bar}} '''Ridge regression''' (also known as '''Tikhonov regularization''', named for [[Andrey Nikolayevich Tikhonov|Andrey Tikhonov]]) is a method of estimating the [[coefficient]]s of multiple-[[regression model]]s in scenarios where the independent variables are highly correlated.<ref name=Hilt>{{cite book |last1=Hilt |first1=Donald E. |last2=Seegrist |first2=Donald W. |title=Ridge, a computer program for calculating ridge regression estimates |date=1977 |doi=10.5962/bhl.title.68934 |url=https://www.biodiversitylibrary.org/bibliography/68934 }}{{pn|date=April 2022}}</ref> It has been used in many fields including econometrics, chemistry, and engineering.<ref name=Gruber /> It is a method of [[regularization (mathematics)|regularization]] of [[ill-posed problem]]s.{{efn|In [[statistics]], the method is known as '''ridge regression'''; in [[machine learning]], it and its modifications are known as '''weight decay'''; and, owing to multiple independent discoveries, it is also variously known as the '''Tikhonov–Miller method''', the '''Phillips–Twomey method''', the '''constrained linear inversion''' method, '''{{math|''L''<sub>2</sub>}} regularization''', and the method of '''linear regularization'''. It is related to the [[Levenberg–Marquardt algorithm]] for [[non-linear least squares|non-linear least-squares]] problems.}} It is particularly useful for mitigating the problem of [[multicollinearity]] in [[linear regression]], which commonly occurs in models with large numbers of parameters.<ref>{{cite book |first=Peter |last=Kennedy |author-link=Peter Kennedy (economist) |title=A Guide to Econometrics |location=Cambridge |publisher=The MIT Press |edition=Fifth |year=2003 |isbn=0-262-61183-X |pages=205–206 |url=https://books.google.com/books?id=B8I5SP69e4kC&pg=PA205 }}</ref> In general, the method provides improved [[Efficient estimator|efficiency]] in parameter estimation problems in exchange for a tolerable amount of [[Bias of an estimator|bias]] (see [[bias–variance tradeoff]]).<ref>{{cite book |first=Marvin |last=Gruber |title=Improving Efficiency by Shrinkage: The James–Stein and Ridge Regression Estimators |location=Boca Raton |publisher=CRC Press |year=1998 |pages=7–15 |isbn=0-8247-0156-9 |url=https://books.google.com/books?id=wmA_R3ZFrXYC&pg=PA7 }}</ref> The theory was first introduced by Hoerl and Kennard in 1970 in their ''[[Technometrics]]'' papers "Ridge Regression: Biased Estimation for Nonorthogonal Problems" and "Ridge Regression: Applications to Nonorthogonal Problems".<ref>{{cite journal |last1=Hoerl |first1=Arthur E. |last2=Kennard |first2=Robert W. |title=Ridge Regression: Biased Estimation for Nonorthogonal Problems |journal=Technometrics |date=1970 |volume=12 |issue=1 |pages=55–67 |doi=10.2307/1267351 |jstor=1267351 }}</ref><ref>{{cite journal |last1=Hoerl |first1=Arthur E. |last2=Kennard |first2=Robert W. |title=Ridge Regression: Applications to Nonorthogonal Problems |journal=Technometrics |date=1970 |volume=12 |issue=1 |pages=69–82 |doi=10.2307/1267352 |jstor=1267352 }}</ref><ref name=Hilt /> Ridge regression was developed as a possible solution to the imprecision of least-squares estimators when linear regression models have multicollinear (highly correlated) independent variables; the solution takes the form of a ridge regression estimator (RR).
This estimator provides more precise estimates of the regression parameters, as its variance and mean squared error are often smaller than those of the previously derived least-squares estimators.<ref name=Jolliffe>{{cite book |last1=Jolliffe |first1=I. T. |title=Principal Component Analysis |date=2006 |publisher=Springer Science & Business Media |isbn=978-0-387-22440-4 |page=178 |url=https://books.google.com/books?id=6ZUMBwAAQBAJ&pg=PA178 }}</ref><ref name=Gruber>{{cite book |last1=Gruber |first1=Marvin |title=Improving Efficiency by Shrinkage: The James–Stein and Ridge Regression Estimators |date=1998 |publisher=CRC Press |isbn=978-0-8247-0156-7 |page=2 |url=https://books.google.com/books?id=wmA_R3ZFrXYC&pg=PA2 }}</ref>
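
In its standard form, and assuming the usual notation (design matrix <math>X</math>, response vector <math>y</math>, identity matrix <math>I</math>, and regularization parameter <math>\lambda \ge 0</math>, none of which are defined in this lead), the ridge estimator can be written as the minimizer of a penalized least-squares objective with a closed-form solution,

<math display="block">\hat\beta_{\text{ridge}} = \underset{\beta}{\operatorname{arg\,min}} \left( \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2 \right) = (X^\mathsf{T} X + \lambda I)^{-1} X^\mathsf{T} y,</math>

which coincides with the ordinary least-squares estimator when <math>\lambda = 0</math> and shrinks the coefficient estimates toward zero as <math>\lambda</math> increases.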