===Finding projection with an inner product=== Let <math>V</math> be a subspace of an inner product space, spanned by mutually orthogonal vectors <math>\mathbf u_1, \mathbf u_2, \dots, \mathbf u_p</math>. Let <math>\mathbf y</math> be a vector. One can define the projection of <math>\mathbf y</math> onto <math>V</math> as <math display="block"> \operatorname{proj}_V \mathbf y = \frac{\mathbf y \cdot \mathbf u_i}{\mathbf u_i \cdot \mathbf u_i } \mathbf u_i </math> where the repeated index is summed over ([[Einstein notation|Einstein sum notation]]). The vector <math>\mathbf y</math> can then be written as an orthogonal sum <math>\mathbf y = \operatorname{proj}_V \mathbf y + \mathbf z</math>, where <math>\mathbf z</math> is orthogonal to <math>V</math>. The projection <math>\operatorname{proj}_V \mathbf y</math> is sometimes denoted <math>\hat{\mathbf y}</math>. There is a theorem in linear algebra stating that <math>\|\mathbf z\|</math> is the shortest distance (the ''[[orthogonal distance]]'') from <math>\mathbf y</math> to <math>V</math>; this best-approximation property is commonly used in areas such as [[machine learning]]. [[File:Ortho projection.svg|thumb|''y'' is being projected onto the vector space ''V''.]]
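The summation formula above can be sketched numerically. The following is a minimal NumPy illustration (the particular vectors are arbitrary example values, not taken from the article): the projection is the sum of <math>\tfrac{\mathbf y \cdot \mathbf u_i}{\mathbf u_i \cdot \mathbf u_i}\mathbf u_i</math> over the orthogonal spanning vectors, and the residual <math>\mathbf z</math> is orthogonal to each of them.

```python
import numpy as np

# Example orthogonal (not necessarily orthonormal) vectors spanning V,
# and an example vector y; these values are illustrative only.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 1.0, 4.0])

def proj(y, basis):
    """Project y onto span(basis), assuming the basis vectors are orthogonal."""
    return sum((y @ u) / (u @ u) * u for u in basis)

y_hat = proj(y, [u1, u2])  # projection of y onto V
z = y - y_hat              # orthogonal component: y = y_hat + z

print(y_hat)            # [3. 1. 0.]
print(z @ u1, z @ u2)   # both 0: z is orthogonal to V
```

The norm of `z` here is the orthogonal distance from `y` to the plane spanned by `u1` and `u2`.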