Editing Mean squared error (section)
==Loss function==
Squared error loss is one of the most widely used [[loss function]]s in statistics, though its widespread use stems more from mathematical convenience than from considerations of actual loss in applications. [[Carl Friedrich Gauss]], who introduced the use of mean squared error, was aware of its arbitrariness and agreed with objections to it on these grounds.<ref name="pointEstimation" /> The mathematical benefits of mean squared error are particularly evident in its use in analyzing the performance of [[linear regression]], as it allows one to partition the variation in a dataset into variation explained by the model and variation explained by randomness.

===Criticism===
The unquestioning use of mean squared error has been criticized by the [[decision theory|decision theorist]] [[James Berger (statistician)|James Berger]]. Mean squared error is the negative of the expected value of one specific [[utility function]], the quadratic utility function, which may not be the appropriate utility function to use under a given set of circumstances. There are, however, some scenarios where mean squared error can serve as a good approximation to a loss function occurring naturally in an application.<ref>{{cite book |title=Statistical Decision Theory and Bayesian Analysis |url=https://archive.org/details/statisticaldecis00berg |url-access=limited |first=James O. |last=Berger |author-link=James Berger (statistician) |year=1985 |edition=2nd |publisher=Springer-Verlag |location=New York |isbn=978-0-387-96098-2 |mr=0804611 |chapter=2.4.2 Certain Standard Loss Functions |page=[https://archive.org/details/statisticaldecis00berg/page/n72 60] }}</ref>

Like [[variance]], mean squared error has the disadvantage of heavily weighting [[outliers]].<ref>{{cite journal | last1 = Bermejo | first1 = Sergio | last2 = Cabestany | first2 = Joan | year = 2001 | title = Oriented principal component analysis for large margin classifiers | journal = Neural Networks | volume = 14 | issue = 10 | pages = 1447–1461 | doi = 10.1016/S0893-6080(01)00106-X | pmid = 11771723 }}</ref> This is a result of the squaring of each term, which effectively weights large errors more heavily than small ones. This property, undesirable in many applications, has led researchers to use alternatives such as the [[mean absolute error]], or those based on the [[median]].
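The outlier sensitivity described above can be made concrete with a short numerical sketch (illustrative only, not part of the article): replacing one unit-sized residual with a single large error moves the mean squared error far more than the mean absolute error, because the squared term contributes quadratically to the average.

```python
# Sketch: MSE vs. MAE on a small set of residuals, with and without an outlier.
def mse(errors):
    """Mean squared error of a list of residuals."""
    return sum(e * e for e in errors) / len(errors)

def mae(errors):
    """Mean absolute error of a list of residuals."""
    return sum(abs(e) for e in errors) / len(errors)

clean = [1.0, -1.0, 1.0, -1.0]          # four unit-sized residuals
with_outlier = [1.0, -1.0, 1.0, 10.0]   # one residual replaced by an outlier

# The single outlier contributes 100 to the sum of squares, so MSE jumps
# from 1.0 to 25.75, while MAE only moves from 1.0 to 3.25.
print(mse(clean), mse(with_outlier))   # 1.0 25.75
print(mae(clean), mae(with_outlier))   # 1.0 3.25
```

This is the sense in which squaring "weights large errors more heavily than small ones": a residual ten times larger than the rest dominates the squared average but shifts the absolute average only modestly.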