{{Short description|Mathematical relation assigning a probability event to a cost}}
[[File:Comparison of loss functions.png|thumb|Comparison of common loss functions ([[Mean absolute error|MAE]], [[Symmetric mean absolute percentage error|SMAE]], [[Huber loss]], and Log-Cosh Loss) used for regression]]
In [[mathematical optimization]] and [[decision theory]], a '''loss function''' or '''cost function''' (sometimes also called an error function)<ref name="ttf2001">{{cite book|first1=Trevor |last1=Hastie |authorlink1= |first2=Robert |last2=Tibshirani |authorlink2=Robert Tibshirani|first3=Jerome H. |last3=Friedman |authorlink3=Jerome H. Friedman |title=The Elements of Statistical Learning |publisher=Springer |year=2001 |isbn=0-387-95284-5 |page=18 |url=https://web.stanford.edu/~hastie/ElemStatLearn/}}</ref> is a function that maps an [[event (probability theory)|event]] or values of one or more variables onto a [[real number]] intuitively representing some "cost" associated with the event. An [[optimization problem]] seeks to minimize a loss function. An '''objective function''' is either a loss function or its opposite (in specific domains, variously called a [[reward function]], a [[profit function]], a [[utility function]], a [[fitness function]], etc.), in which case it is to be maximized. The loss function could include terms from several levels of the hierarchy.

In statistics, a loss function is typically used for [[parameter estimation]], and the event in question is some function of the difference between estimated and true values for an instance of data. The concept, as old as [[Pierre-Simon Laplace|Laplace]], was reintroduced in statistics by [[Abraham Wald]] in the middle of the 20th century.<ref>{{cite book |first=A. |last=Wald |title=Statistical Decision Functions |publisher=Wiley |year=1950 |url=https://psycnet.apa.org/record/1951-01400-000}}</ref>

In the context of [[economics]], for example, this is usually [[economic cost]] or [[Regret (decision theory)|regret]]. In [[Statistical classification|classification]], it is the penalty for an incorrect classification of an example. In [[actuarial science]], it is used in an insurance context to model benefits paid over premiums, particularly since the works of [[Harald Cramér]] in the 1920s.<ref>{{cite book |last=Cramér |first=H. |year=1930 |title=On the mathematical theory of risk |publisher=Centraltryckeriet}}</ref> In [[optimal control]], the loss is the penalty for failing to achieve a desired value. In [[financial risk management]], the function is mapped to a monetary loss.
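As an illustrative example (the symbols <math>\theta</math> and <math>\hat{\theta}</math> are notation introduced here for the true parameter value and its estimate), two widely used loss functions in parameter estimation are the squared-error loss and the absolute-error loss:
<math display="block">L(\theta, \hat{\theta}) = (\theta - \hat{\theta})^2, \qquad L(\theta, \hat{\theta}) = \left|\theta - \hat{\theta}\right|.</math>
The quadratic form penalizes large deviations more heavily than the absolute form; under standard conditions, minimizing the expected squared-error loss leads to the mean as the estimate, while minimizing the expected absolute-error loss leads to the median.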