=== Regression analysis ===
{{Main|Regression analysis}}
[[Image:Linear regression.svg|thumb|upright=1.3|Illustration of linear regression on a data set]]
Regression analysis encompasses a large variety of statistical methods to estimate the relationship between input variables and their associated features. Its most common form is [[linear regression]], where a single line is drawn to best fit the given data according to a mathematical criterion such as [[ordinary least squares]]. The latter is often extended by [[regularization (mathematics)|regularisation]] methods to mitigate overfitting and bias, as in [[ridge regression]]. When dealing with non-linear problems, go-to models include [[polynomial regression]] (for example, used for trendline fitting in Microsoft Excel<ref>{{cite web|last1=Stevenson|first1=Christopher|title=Tutorial: Polynomial Regression in Excel|url=https://facultystaff.richmond.edu/~cstevens/301/Excel4.html|website=facultystaff.richmond.edu|access-date=22 January 2017|archive-date=2 June 2013|archive-url=https://web.archive.org/web/20130602200850/https://facultystaff.richmond.edu/~cstevens/301/Excel4.html|url-status=live}}</ref>), [[logistic regression]] (often used in [[statistical classification]]) or even [[kernel regression]], which introduces non-linearity by taking advantage of the [[kernel trick]] to implicitly map input variables to a higher-dimensional space.

[[General linear model|Multivariate linear regression]] extends the concept of linear regression to handle multiple dependent variables simultaneously. This approach estimates the relationships between a set of input variables and several output variables by fitting a [[Multidimensional system|multidimensional]] linear model. It is particularly useful in scenarios where outputs are interdependent or share underlying patterns, such as predicting multiple economic indicators or reconstructing images,<ref>{{cite journal |last1= Wanta |first1= Damian |last2= Smolik |first2= Aleksander |last3= Smolik |first3= Waldemar T. |last4= Midura |first4= Mateusz |last5= Wróblewski |first5= Przemysław |date= 2025 |title= Image reconstruction using machine-learned pseudoinverse in electrical capacitance tomography |journal= Engineering Applications of Artificial Intelligence |volume= 142|page= 109888|doi= 10.1016/j.engappai.2024.109888 |doi-access= free}}</ref> which are inherently multi-dimensional.
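The following is a minimal sketch of the methods named above. It assumes NumPy and scikit-learn, which the text does not prescribe; the data, parameter values, and model choices are illustrative only.

<syntaxhighlight lang="python">
# Illustrative sketch (assumes NumPy and scikit-learn; not part of the article text).
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))                     # single input variable
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)    # noisy non-linear target

# Ordinary least squares: fit a single line minimising the squared error.
ols = LinearRegression().fit(X, y)

# Ridge regression: least squares plus an L2 regularisation penalty (alpha)
# to mitigate overfitting.
ridge = Ridge(alpha=1.0).fit(X, y)

# Polynomial regression: expand the input into polynomial features, then fit
# a linear model on the expanded representation.
poly = make_pipeline(PolynomialFeatures(degree=3), LinearRegression()).fit(X, y)

# Kernel (ridge) regression: the kernel trick implicitly maps inputs to a
# higher-dimensional space without computing that mapping explicitly.
kernel = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5).fit(X, y)

# Multivariate linear regression: a linear model fitted to several output
# variables at once (here a hypothetical two-column target).
Y_multi = np.column_stack([y, 2 * y + 1])
multi = LinearRegression().fit(X, Y_multi)

for name, model in [("OLS", ols), ("Ridge", ridge),
                    ("Polynomial", poly), ("Kernel", kernel)]:
    print(name, model.predict([[1.5]]))
</syntaxhighlight>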