===Inner-product spaces===
{{Main|Inner product space}}
Besides these basic concepts, linear algebra also studies vector spaces with additional structure, such as an [[inner product]]. The inner product is an example of a [[bilinear form]], and it gives the vector space a geometric structure by allowing for the definition of length and angles. Formally, an ''inner product'' is a map
:<math> \langle \cdot, \cdot \rangle : V \times V \to F </math>
that satisfies the following three [[axiom]]s for all vectors {{math|'''u''', '''v''', '''w'''}} in {{math|''V''}} and all scalars {{math|''a''}} in {{math|''F''}}:<ref name= Jain>{{Cite book|title=Functional analysis|author=P. K. Jain, Khalil Ahmad|chapter-url=https://books.google.com/books?id=yZ68h97pnAkC&pg=PA203|page=203|chapter=5.1 Definitions and basic properties of inner product spaces and Hilbert spaces|isbn=81-224-0801-X|year=1995|edition=2nd|publisher=New Age International}}</ref><ref name="Prugovec̆ki">{{Cite book|title=Quantum mechanics in Hilbert space|author=Eduard Prugovec̆ki|chapter-url=https://books.google.com/books?id=GxmQxn2PF3IC&pg=PA18|chapter=Definition 2.1|pages=18 ''ff''|isbn=0-12-566060-X|year=1981|publisher=Academic Press|edition=2nd}}</ref>
* [[complex conjugate|Conjugate]] symmetry:
*:<math>\langle \mathbf u, \mathbf v\rangle =\overline{\langle \mathbf v, \mathbf u\rangle}.</math>
:Over <math>\mathbb{R}</math>, conjugate symmetry reduces to ordinary symmetry.
* [[Linear]]ity in the first argument:
*:<math>\begin{align} \langle a \mathbf u, \mathbf v\rangle &= a \langle \mathbf u, \mathbf v\rangle, \\ \langle \mathbf u + \mathbf v, \mathbf w\rangle &= \langle \mathbf u, \mathbf w\rangle+ \langle \mathbf v, \mathbf w\rangle. \end{align}</math>
* [[Definite bilinear form|Positive-definiteness]]:
*:<math>\langle \mathbf v, \mathbf v\rangle \geq 0,</math>
:with equality only for {{math|'''v''' {{=}} 0}}.
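The three axioms above can be checked numerically for the standard inner product on <math>\mathbb{C}^n</math>. The following sketch (not part of the article; the function name <code>inner</code> and the NumPy-based setup are illustrative assumptions) verifies conjugate symmetry, linearity in the first argument, and positive-definiteness on random complex vectors:

```python
import numpy as np

def inner(u, v):
    # Standard inner product on C^n, linear in the first argument:
    # <u, v> = sum_i u_i * conj(v_i)
    return np.sum(u * np.conj(v))

rng = np.random.default_rng(0)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
a = 2.0 - 0.5j

# Conjugate symmetry: <u, v> = conj(<v, u>)
assert np.isclose(inner(u, v), np.conj(inner(v, u)))

# Linearity in the first argument
assert np.isclose(inner(a * u, v), a * inner(u, v))
assert np.isclose(inner(u + v, w), inner(u, w) + inner(v, w))

# Positive-definiteness: <v, v> is real and non-negative
assert np.isclose(inner(v, v).imag, 0.0)
assert inner(v, v).real >= 0
```

Note that over <math>\mathbb{C}</math> the conjugation is essential: without it, <math>\langle \mathbf v, \mathbf v\rangle</math> would not be real in general, and no notion of length would result.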
We can define the length of a vector '''v''' in ''V'' by
:<math>\|\mathbf v\|^2=\langle \mathbf v, \mathbf v\rangle,</math>
and we can prove the [[Cauchy–Schwarz inequality]]:
:<math>|\langle \mathbf u, \mathbf v\rangle| \leq \|\mathbf u\| \cdot \|\mathbf v\|.</math>
In particular, the quantity
:<math>\frac{|\langle \mathbf u, \mathbf v\rangle|}{\|\mathbf u\| \cdot \|\mathbf v\|} \leq 1,</math>
and so it can be called the cosine of the angle between the two vectors.

Two vectors are orthogonal if {{math|⟨'''u''', '''v'''⟩ {{=}} 0}}. An orthonormal basis is a basis in which all basis vectors have length 1 and are orthogonal to each other. For any finite-dimensional inner product space, an orthonormal basis can be found by the [[Gram–Schmidt]] procedure. Orthonormal bases are particularly easy to work with, since if {{nowrap|1='''v''' = ''a''<sub>1</sub> '''v'''<sub>1</sub> + ⋯ + ''a<sub>n</sub>'' '''v'''<sub>''n''</sub>}}, then
:<math>a_i = \langle \mathbf v, \mathbf v_i \rangle.</math>

The inner product facilitates the construction of many useful concepts. For instance, given a transform {{math|''T''}}, we can define its [[Hermitian conjugate]] {{math|''T*''}} as the linear transform satisfying
:<math> \langle T \mathbf u, \mathbf v \rangle = \langle \mathbf u, T^* \mathbf v\rangle.</math>
If {{math|''T''}} satisfies {{math|''TT*'' {{=}} ''T*T''}}, we call {{math|''T''}} [[Normal matrix|normal]]. It turns out that normal matrices are precisely the matrices that have an orthonormal system of eigenvectors that span {{math|''V''}}.<!-- This is a potentially useful remark, but a proper context needs to be set for it. One can say quite simply that the [[linear]] problems of [[mathematics]]—those that exhibit [[linearity]] in their behavior—are those most likely to be solved. For example, [[differential calculus]] does a great deal with linear approximation to functions. The difference from [[nonlinearity|nonlinear]] problems is very important in practice.-->
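The Gram–Schmidt procedure and the coefficient formula {{math|1=''a<sub>i</sub>'' = ⟨'''v''', '''v'''<sub>''i''</sub>⟩}} can be illustrated concretely. This sketch (not part of the article; the function name <code>gram_schmidt</code> and the random test vectors are illustrative assumptions) orthonormalizes three independent real vectors, then recovers the expansion coefficients of an arbitrary vector by inner products, and checks the Cauchy–Schwarz inequality along the way:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent real vectors."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each vector found so far
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

rng = np.random.default_rng(1)
vecs = [rng.standard_normal(3) for _ in range(3)]
q = gram_schmidt(vecs)

# Orthonormality: <q_i, q_j> = 1 if i == j, else 0
for i in range(3):
    for j in range(3):
        assert np.isclose(np.dot(q[i], q[j]), float(i == j))

# Coefficient recovery: v = sum_i a_i q_i with a_i = <v, q_i>
v = rng.standard_normal(3)
coeffs = [np.dot(v, qi) for qi in q]
recon = sum(a * qi for a, qi in zip(coeffs, q))
assert np.allclose(recon, v)

# Cauchy–Schwarz: |<u, v>| <= ||u|| * ||v||
u = rng.standard_normal(3)
assert abs(np.dot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12
```

The coefficient-recovery step is exactly why orthonormal bases are convenient: no linear system needs to be solved, since each coefficient is a single inner product.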