===Example loss function===
Let <math>y,y'</math> be vectors in <math>\mathbb{R}^n</math>. Select an error function <math>E(y,y')</math> measuring the difference between two outputs. The standard choice is half the square of the [[Euclidean distance]] between the vectors <math>y</math> and <math>y'</math>:
<math display="block">E(y,y') = \tfrac{1}{2} \lVert y-y'\rVert^2</math>
The error function over <math display="inline">m</math> training examples can then be written as an average of the losses on the individual examples:
<math display="block">E=\frac{1}{2m}\sum_x\lVert y(x)-y'(x) \rVert^2</math>
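For concreteness, here is a minimal NumPy sketch of this loss; the function names <code>loss</code> and <code>total_error</code> are illustrative, not part of any standard library.

<syntaxhighlight lang="python">
import numpy as np

def loss(y, y_prime):
    # Half the squared Euclidean distance between one target vector
    # and one network output: E(y, y') = 1/2 * ||y - y'||^2.
    return 0.5 * np.sum((np.asarray(y) - np.asarray(y_prime)) ** 2)

def total_error(targets, outputs):
    # Average of the per-example losses over m training examples:
    # E = 1/(2m) * sum_x ||y(x) - y'(x)||^2.
    m = len(targets)
    return sum(loss(y, y_p) for y, y_p in zip(targets, outputs)) / m

# Example: two training examples with 2-dimensional outputs.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
outputs = [np.array([0.9, 0.2]), np.array([0.1, 0.7])]
print(total_error(targets, outputs))  # 0.0375
</syntaxhighlight>

Because each per-example term is half a squared norm, the gradient of <code>loss</code> with respect to the output is simply <math>y'-y</math>, which is what makes this choice convenient for backpropagation.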