Mean squared error
{{Short description|Measure of the error of an estimator}}
{{distinguish-redirect|Mean squared deviation|Mean squared displacement}}

In [[statistics]], the '''mean squared error''' ('''MSE''')<ref name=":1">{{Cite web|title=Mean Squared Error (MSE)|url=https://www.probabilitycourse.com/chapter9/9_1_5_mean_squared_error_MSE.php|access-date=2020-09-12|website=www.probabilitycourse.com}}</ref> or '''mean squared deviation''' ('''MSD''') of an [[estimator]] (that is, of a procedure for estimating an unobserved quantity) measures the [[expected value|average]] of the squares of the [[Error (statistics)|errors]], that is, the average squared difference between the estimated values and the [[true value]]. MSE is a [[risk function]], corresponding to the [[expected value]] of the [[squared error loss]].<ref>{{cite book |title=Mathematical Statistics: Basic Ideas and Selected Topics |volume=I |edition=Second |last1=Bickel |first1=Peter J. |authorlink1=Peter J. Bickel |last2=Doksum |first2=Kjell A. |year=2015 |page=20 |quotation="If we use quadratic loss, our risk function is called the ''mean squared error'' (MSE) ..."}}</ref> The fact that MSE is almost always strictly positive (and not zero) is because of [[randomness]] or because the estimator [[Omitted-variable bias|does not account for information]] that could produce a more accurate estimate.<ref name="pointEstimation">{{cite book |first1=E. L. |last1=Lehmann |first2=George |last2=Casella |title=Theory of Point Estimation |publisher=Springer |location=New York |year=1998 |edition=2nd |isbn=978-0-387-98502-2 |mr=1639875}}</ref> In [[machine learning]], specifically [[empirical risk minimization]], MSE may refer to the ''empirical'' risk (the average loss on an observed data set), as an estimate of the true MSE (the true risk: the average loss on the actual population distribution).

The MSE is a measure of the quality of an estimator. As it is derived from the square of [[Euclidean distance]], it is always a non-negative value that decreases as the error approaches zero.

The MSE is the second [[moment (mathematics)|moment]] (about the origin) of the error, and thus incorporates both the [[variance]] of the estimator (how widely spread the estimates are from one [[data sample]] to another) and its [[Bias of an estimator|bias]] (how far off the average estimated value is from the true value).{{citation needed|reason=This is a known fact, you can look up "An Introduction to Statistical Learning" for a discussion, but it requires a citation|date=May 2021}} For an [[unbiased estimator]], the MSE is the variance of the estimator. Like the variance, MSE has the same units of measurement as the square of the quantity being estimated. In an analogy to [[standard deviation]], taking the square root of MSE yields the ''root-mean-square error'' or ''[[root-mean-square deviation]]'' (RMSE or RMSD), which has the same units as the quantity being estimated; for an unbiased estimator, the RMSE is the square root of the [[variance]], known as the [[standard error]].
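As a minimal numerical sketch of these definitions (the sample values and variable names below are invented purely for illustration and are not drawn from any source), the following computes the MSE and RMSE of a set of estimates of a known true value and checks that the MSE equals the variance of the estimates plus the square of their bias:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical repeated estimates of a known true value (illustrative data only).
true_value = 10.0
estimates = np.array([9.2, 10.5, 10.1, 9.8, 11.0])

errors = estimates - true_value
mse = np.mean(errors ** 2)    # mean squared error: average of the squared errors
rmse = np.sqrt(mse)           # root-mean-square error: same units as the quantity

# The MSE decomposes into the variance of the estimates plus the squared bias.
bias = np.mean(estimates) - true_value
variance = np.var(estimates)  # population variance (ddof=0), so the identity is exact
assert np.isclose(mse, variance + bias ** 2)

print(f"MSE = {mse:.4f}, RMSE = {rmse:.4f}, bias = {bias:.4f}, variance = {variance:.4f}")
</syntaxhighlight>

The population form of the variance (<code>ddof=0</code>) is used so that the identity MSE = variance + bias² holds exactly for the sample at hand; for an unbiased estimator the bias term vanishes and the MSE reduces to the variance, as noted above.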