==Duality==
{{main|Dual space}}

A [[linear form]] is a linear map from a vector space {{mvar|V}} over a field {{mvar|F}} to the field of scalars {{mvar|F}}, viewed as a vector space over itself. Equipped with [[pointwise]] addition and multiplication by a scalar, the linear forms form a vector space, called the '''dual space''' of {{mvar|V}}, and usually denoted {{mvar|V*}}<ref>{{Harvp|Katznelson|Katznelson|2008}} p. 37 §2.1.3</ref> or {{mvar|V{{prime}}}}.<ref>{{Harvp|Halmos|1974}} p. 20, §13</ref><ref>{{Harvp|Axler|2015}} p. 101, §3.94</ref>

If {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}} is a basis of {{mvar|V}} (this implies that {{mvar|V}} is finite-dimensional), then one can define, for {{math|1=''i'' = 1, ..., ''n''}}, a linear map {{math|''v<sub>i</sub>''*}} such that {{math|''v<sub>i</sub>''*('''v'''<sub>''i''</sub>) {{=}} 1}} and {{math|''v<sub>i</sub>''*('''v'''<sub>''j''</sub>) {{=}} 0}} if {{math|''j'' ≠ ''i''}}. These linear maps form a basis of {{math|''V''*}}, called the [[dual basis]] of {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}}. (If {{mvar|V}} is not finite-dimensional, the {{math|''v<sub>i</sub>''*}} may be defined similarly; they are linearly independent, but do not form a basis.)

For {{math|'''v'''}} in {{mvar|V}}, the map
:<math>f\to f(\mathbf v)</math>
is a linear form on {{mvar|V*}}. This defines the [[canonical map|canonical linear map]] from {{mvar|V}} into {{math|(''V''*)*}}, the dual of {{mvar|V*}}, called the '''[[double dual]]''' or '''[[bidual]]''' of {{mvar|V}}. This canonical map is an [[isomorphism]] if {{mvar|V}} is finite-dimensional, and this allows identifying {{mvar|V}} with its bidual. (In the infinite-dimensional case, the canonical map is injective, but not surjective.)

There is thus a complete symmetry between a finite-dimensional vector space and its dual. This motivates the frequent use, in this context, of the [[bra–ket notation]]
:<math>\langle f, \mathbf x\rangle</math>
for denoting {{math|''f''('''x''')}}.

===Dual map===
{{main|Transpose of a linear map}}

Let
:<math>f:V\to W</math>
be a linear map. For every linear form {{mvar|h}} on {{mvar|W}}, the [[composite function]] {{math|''h'' ∘ ''f''}} is a linear form on {{mvar|V}}. This defines a linear map
:<math>f^*:W^*\to V^*</math>
between the dual spaces, which is called the '''dual''' or the '''transpose''' of {{mvar|f}}.

If {{mvar|V}} and {{mvar|W}} are finite-dimensional, and {{mvar|M}} is the matrix of {{mvar|f}} in terms of some ordered bases, then the matrix of {{mvar|f*}} over the dual bases is the [[transpose]] {{math|''M''<sup>T</sup>}} of {{mvar|M}}, obtained by exchanging rows and columns. If elements of vector spaces and their duals are represented by column vectors, this duality may be expressed in [[bra–ket notation]] by
:<math>\langle h^\mathsf T , M \mathbf v\rangle = \langle h^\mathsf T M, \mathbf v\rangle.</math>
To highlight this symmetry, the two members of this equality are sometimes written
:<math>\langle h^\mathsf T \mid M \mid \mathbf v\rangle.</math>
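The following is a brief numerical illustration, not drawn from the cited sources: it uses Python with NumPy, identifies linear forms on {{math|'''R'''<sup>''n''</sup>}} with their coefficient vectors, and checks the identity above for an arbitrarily chosen matrix.

<syntaxhighlight lang="python">
# Illustrative sketch: numerically verify <h, M v> = <M^T h, v>,
# identifying linear forms on R^n with their coefficient vectors.
import numpy as np

rng = np.random.default_rng(seed=0)
M = rng.standard_normal((3, 2))  # matrix of a linear map f : R^2 -> R^3
v = rng.standard_normal(2)       # a vector in V = R^2
h = rng.standard_normal(3)       # coefficients of a linear form on W = R^3

lhs = h @ (M @ v)     # h applied to f(v)
rhs = (M.T @ h) @ v   # the transpose f*(h) applied to v
assert np.isclose(lhs, rhs)
</syntaxhighlight>

The equality holds by associativity of matrix multiplication, which is exactly the content of the bra–ket expression above.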
===Inner-product spaces===
{{Main|Inner product space}}

Besides these basic concepts, linear algebra also studies vector spaces with additional structure, such as an [[inner product]]. The inner product is an example of a [[bilinear form]], and it gives the vector space a geometric structure by allowing for the definition of length and angles. Formally, an ''inner product'' is a map
:<math> \langle \cdot, \cdot \rangle : V \times V \to F </math>
that satisfies the following three [[axiom]]s for all vectors {{math|'''u''', '''v''', '''w'''}} in {{math|''V''}} and all scalars {{math|''a''}} in {{math|''F''}}:<ref name="Jain">{{Cite book|title=Functional analysis|author=P. K. Jain, Khalil Ahmad|chapter-url=https://books.google.com/books?id=yZ68h97pnAkC&pg=PA203|page=203|chapter=5.1 Definitions and basic properties of inner product spaces and Hilbert spaces|isbn=81-224-0801-X|year=1995|edition=2nd|publisher=New Age International}}</ref><ref name="Prugovec̆ki">{{Cite book|title=Quantum mechanics in Hilbert space|author=Eduard Prugovec̆ki|chapter-url=https://books.google.com/books?id=GxmQxn2PF3IC&pg=PA18|chapter=Definition 2.1|pages=18 ''ff''|isbn=0-12-566060-X|year=1981|publisher=Academic Press|edition=2nd}}</ref>
* [[complex conjugate|Conjugate]] symmetry:
*:<math>\langle \mathbf u, \mathbf v\rangle =\overline{\langle \mathbf v, \mathbf u\rangle}.</math>
:Over <math>\mathbb{R}</math>, conjugation is trivial, so this reduces to ordinary symmetry.
* [[Linear]]ity in the first argument:
*:<math>\begin{align} \langle a \mathbf u, \mathbf v\rangle &= a \langle \mathbf u, \mathbf v\rangle. \\ \langle \mathbf u + \mathbf v, \mathbf w\rangle &= \langle \mathbf u, \mathbf w\rangle+ \langle \mathbf v, \mathbf w\rangle. \end{align}</math>
* [[Definite bilinear form|Positive-definiteness]]:
*:<math>\langle \mathbf v, \mathbf v\rangle \geq 0,</math>
:with equality only for {{math|'''v''' {{=}} 0}}.

We can define the length of a vector {{math|'''v'''}} in {{mvar|V}} by
:<math>\|\mathbf v\|^2=\langle \mathbf v, \mathbf v\rangle,</math>
and we can prove the [[Cauchy–Schwarz inequality]]:
:<math>|\langle \mathbf u, \mathbf v\rangle| \leq \|\mathbf u\| \cdot \|\mathbf v\|.</math>
In particular, the quantity
:<math>\frac{|\langle \mathbf u, \mathbf v\rangle|}{\|\mathbf u\| \cdot \|\mathbf v\|} \leq 1,</math>
and so this quantity can be called the cosine of the angle between the two vectors.

Two vectors are orthogonal if {{math|⟨'''u''', '''v'''⟩ {{=}} 0}}. An orthonormal basis is a basis in which all basis vectors have length 1 and are orthogonal to each other. Given any finite-dimensional inner product space, an orthonormal basis can be found by the [[Gram–Schmidt]] procedure. Orthonormal bases are particularly easy to work with, since if {{nowrap|1='''v''' = ''a''<sub>1</sub> '''v'''<sub>1</sub> + ⋯ + ''a<sub>n</sub>'' '''v'''<sub>''n''</sub>}}, then
:<math>a_i = \langle \mathbf v, \mathbf v_i \rangle.</math>

The inner product facilitates the construction of many useful concepts. For instance, given a linear transform {{math|''T''}}, we can define its [[Hermitian conjugate]] {{math|''T*''}} as the linear transform satisfying
:<math> \langle T \mathbf u, \mathbf v \rangle = \langle \mathbf u, T^* \mathbf v\rangle.</math>
If {{math|''T''}} satisfies {{math|''TT*'' {{=}} ''T*T''}}, we call {{math|''T''}} [[Normal matrix|normal]]. It turns out that normal matrices are precisely the matrices that have an orthonormal system of eigenvectors that span {{math|''V''}}.<!-- This is a potentially useful remark, but a proper context needs to be set for it. One can say quite simply that the [[linear]] problems of [[mathematics]]—those that exhibit [[linearity]] in their behavior—are those most likely to be solved. For example, [[differential calculus]] does a great deal with linear approximation to functions. The difference from [[nonlinearity|nonlinear]] problems is very important in practice.-->
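As a brief numerical illustration, not drawn from the cited sources and assuming the standard dot product on {{math|'''R'''<sup>3</sup>}}, the following Python snippet applies the classical Gram–Schmidt procedure to a basis of {{math|'''R'''<sup>3</sup>}}, recovers the coefficients {{math|''a<sub>i</sub>'' {{=}} ⟨'''v''', '''v'''<sub>''i''</sub>⟩}} in the resulting orthonormal basis, and checks the Cauchy–Schwarz inequality.

<syntaxhighlight lang="python">
# Illustrative sketch, assuming the standard dot product on R^3:
# classical Gram-Schmidt, then expansion of a vector in the
# resulting orthonormal basis via a_i = <v, v_i>.
import numpy as np

def gram_schmidt(basis):
    """Return an orthonormal basis with the same span (classical Gram-Schmidt)."""
    ortho = []
    for b in basis:
        for q in ortho:                # subtract projections onto earlier vectors
            b = b - np.dot(q, b) * q
        ortho.append(b / np.linalg.norm(b))
    return ortho

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
q = gram_schmidt(basis)

v = np.array([2.0, -1.0, 3.0])
coeffs = [np.dot(v, qi) for qi in q]   # a_i = <v, v_i>
assert np.allclose(sum(a * qi for a, qi in zip(coeffs, q)), v)

# Cauchy-Schwarz: |<u, v>| <= ||u|| * ||v||
u = np.array([1.0, 2.0, 2.0])
assert abs(np.dot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v)
</syntaxhighlight>

In floating-point arithmetic the classical procedure can gradually lose orthogonality, which is why numerical libraries usually rely on a modified variant or on a [[QR decomposition]].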