==In regression==
{{further|Reduced chi-squared statistic}}

In [[regression analysis]], plotting is a natural way to view the overall trend of the data. The mean of the squared distances from each point to the fitted regression model can be computed and reported as the mean squared error. Squaring is essential because it prevents positive and negative errors from cancelling; minimizing the MSE makes the model fit the observed data more closely. One example of linear regression using this criterion is the [[least squares|least squares method]], which evaluates the appropriateness of a linear regression model for a [[Bivariate data|bivariate dataset]],<ref>{{Cite book|title=A modern introduction to probability and statistics : understanding why and how|date=2005|publisher=Springer|others=Dekking, Michel, 1946-|isbn=978-1-85233-896-1|location=London|oclc=262680588}}</ref> though its limitations are tied to the assumed distribution of the data.

The term ''mean squared error'' is sometimes used to refer to the unbiased estimate of error variance: the [[residual sum of squares]] divided by the number of [[Degrees of freedom (statistics)|degrees of freedom]]. This definition for a known, computed quantity differs from the above definition for the computed MSE of a predictor in that a different denominator is used. The denominator is the sample size reduced by the number of model parameters estimated from the same data: (''n''−''p'') for ''p'' [[regressor]]s, or (''n''−''p''−1) if an intercept is used (see [[errors and residuals in statistics]] for more details).<ref>Steel, R.G.D, and Torrie, J. H., ''Principles and Procedures of Statistics with Special Reference to the Biological Sciences.'', [[McGraw Hill]], 1960, page 288.</ref> Although the MSE (as defined in this article) is not an unbiased estimator of the error variance, it is [[consistent estimator|consistent]], given the consistency of the predictor.
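The distinction between the two denominators can be illustrated with a short sketch. The dataset below is hypothetical, chosen only to show the computation: a least-squares line is fitted, and the residuals are averaged once with denominator ''n'' (the MSE as defined in this article) and once with denominator ''n''−''p''−1 (the unbiased estimate of error variance for one regressor plus an intercept).

```python
import numpy as np

# Hypothetical bivariate dataset: y roughly linear in x with noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# Least-squares fit of a line: one regressor (p = 1) plus an intercept.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

n, p = len(y), 1
rss = np.sum(residuals**2)   # residual sum of squares
mse = rss / n                # MSE of the fit (biased for the error variance)
s2 = rss / (n - p - 1)       # unbiased estimate of error variance
```

Because ''n''−''p''−1 is smaller than ''n'', the unbiased estimate is always larger than the MSE for the same residuals; the gap shrinks as the sample size grows, which is consistent with the MSE being a consistent estimator.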
In regression analysis, "mean squared error", often referred to as [[mean squared prediction error]] or "out-of-sample mean squared error", can also refer to the mean value of the [[squared deviations]] of the predictions from the true values, over an out-of-sample [[test set|test space]], generated by a model estimated over a [[training set|particular sample space]]. This also is a known, computed quantity, and it varies by sample and by out-of-sample test space. In the context of [[gradient descent]] algorithms, it is common to introduce a factor of <math>1/2</math> to the MSE for ease of computation after taking the derivative. So a value which is technically half the mean of squared errors may be called the MSE.
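The factor of <math>1/2</math> can be seen directly by differentiating the halved loss. In the sketch below (hypothetical prediction and target vectors, chosen only for illustration), the gradient of the half-MSE with respect to the predictions is simply the mean error vector, with no leftover factor of 2.

```python
import numpy as np

# Hypothetical predictions and targets.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 1.5, 3.5])

err = y_pred - y_true
half_mse = 0.5 * np.mean(err**2)  # the "half MSE" loss used with gradient descent
grad = err / len(err)             # d(half_mse)/d(y_pred): the 1/2 cancels the 2 from squaring
```

Without the <math>1/2</math>, the gradient would be <code>2 * err / len(err)</code>; the halved loss has the same minimizer, so only the notation changes.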