==Vector spaces==
{{Main|Vector space}}
Until the 19th century, linear algebra was introduced through [[systems of linear equations]] and [[matrix (mathematics)|matrices]]. In modern mathematics, the presentation through ''vector spaces'' is generally preferred, since it is more [[synthetic geometry|synthetic]], more general (not limited to the finite-dimensional case), and conceptually simpler, although more abstract.

A vector space over a [[field (mathematics)|field]] {{math|''F''}} (often the field of the [[real number]]s or of the [[complex number]]s) is a [[Set (mathematics)|set]] {{math|''V''}} equipped with two [[binary operation]]s. [[element (mathematics)|Elements]] of {{math|''V''}} are called ''vectors'', and elements of {{math|''F''}} are called ''scalars''. The first operation, ''[[vector addition]]'', takes any two vectors {{math|'''v'''}} and {{math|'''w'''}} and outputs a third vector {{math|'''v''' + '''w'''}}. The second operation, ''[[scalar multiplication]]'', takes any scalar {{math|''a''}} and any vector {{math|'''v'''}} and outputs a new {{nowrap|vector {{math|''a'''''v'''}}}}. The axioms that addition and scalar multiplication must satisfy are the following. (In the list below, {{math|'''u''', '''v'''}} and {{math|'''w'''}} are arbitrary elements of {{math|''V''}}, and {{math|''a''}} and {{math|''b''}} are arbitrary scalars in the field {{math|''F''}}.)<ref>{{harvtxt|Roman|2005|loc=ch. 1, p. 27}}</ref>

:{| border="0" style="width:100%;"
|-
| '''Axiom''' || '''Signification'''
|-
| [[Associativity]] of addition || {{math|1='''u''' + ('''v''' + '''w''') = ('''u''' + '''v''') + '''w'''}}
|- style="background:#F8F4FF;"
| [[Commutativity]] of addition || {{math|1='''u''' + '''v''' = '''v''' + '''u'''}}
|-
| [[Identity element]] of addition || There exists an element {{math|'''0'''}} in {{math|''V''}}, called the ''[[zero vector]]'' (or simply ''zero''), such that {{math|1='''v''' + '''0''' = '''v'''}} for all {{math|'''v'''}} in {{math|''V''}}.
|- style="background:#F8F4FF;"
| [[Inverse element]]s of addition || For every {{math|'''v'''}} in {{math|''V''}}, there exists an element {{math|−'''v'''}} in {{math|''V''}}, called the ''[[additive inverse]]'' of {{math|'''v'''}}, such that {{math|1='''v''' + (−'''v''') = '''0'''}}.
|-
| [[Distributivity]] of scalar multiplication with respect to vector addition || {{math|1=''a''('''u''' + '''v''') = ''a'''''u''' + ''a'''''v'''}}
|- style="background:#F8F4FF;"
| Distributivity of scalar multiplication with respect to field addition || {{math|1=(''a'' + ''b'')'''v''' = ''a'''''v''' + ''b'''''v'''}}
|-
| Compatibility of scalar multiplication with field multiplication || {{math|1=''a''(''b'''''v''') = (''ab'')'''v'''}}{{Efn|This axiom is not asserting the associativity of an operation, since there are two operations in question: scalar multiplication, {{math|''b'''''v'''}}, and field multiplication, {{math|''ab''}}.}}
|- style="background:#F8F4FF;"
| Identity element of scalar multiplication || {{math|1=1'''v''' = '''v'''}}, where {{math|1}} denotes the [[multiplicative identity]] of {{mvar|F}}.
|}

The first four axioms mean that {{math|''V''}} is an [[abelian group]] under addition.

The elements of a specific vector space may have various natures; for example, they could be [[tuple]]s, [[sequence]]s, [[function (mathematics)|function]]s, [[polynomial ring|polynomial]]s, or [[matrix (mathematics)|matrices]]. Linear algebra is concerned with the properties of such objects that are common to all vector spaces.
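As a concrete illustration, the following sketch (assuming Python with NumPy, which are not part of the article itself) spot-checks the eight axioms for {{math|1=''V'' = '''R'''<sup>3</sup>}} over {{math|1=''F'' = '''R'''}} on randomly sampled vectors and scalars. A finite numerical check can only falsify an axiom, never prove it; the sketch is illustrative, not a proof.

<syntaxhighlight lang="python">
import numpy as np

# Spot-check the vector space axioms for V = R^3 over F = R
# on random samples (illustrative only; not a proof).
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))   # three arbitrary vectors in R^3
a, b = rng.standard_normal(2)           # two arbitrary scalars in R

assert np.allclose(u + (v + w), (u + v) + w)    # associativity of addition
assert np.allclose(u + v, v + u)                # commutativity of addition
zero = np.zeros(3)
assert np.allclose(v + zero, v)                 # identity element of addition
assert np.allclose(v + (-v), zero)              # inverse elements of addition
assert np.allclose(a * (u + v), a * u + a * v)  # distributivity over vector addition
assert np.allclose((a + b) * v, a * v + b * v)  # distributivity over field addition
assert np.allclose(a * (b * v), (a * b) * v)    # compatibility with field multiplication
assert np.allclose(1 * v, v)                    # identity element of scalar multiplication
print("All eight axioms hold on this sample.")
</syntaxhighlight>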
===Linear maps===
{{main|Linear map}}
'''Linear maps''' are [[map (mathematics)|mappings]] between vector spaces that preserve the vector-space structure. Given two vector spaces {{math|''V''}} and {{math|''W''}} over a field {{mvar|F}}, a linear map (also called, in some contexts, a linear transformation or linear mapping) is a [[map (mathematics)|map]]
: <math>T: V \to W</math>
that is compatible with addition and scalar multiplication, that is
: <math>T(\mathbf u + \mathbf v) = T(\mathbf u) + T(\mathbf v), \quad T(a \mathbf v) = aT(\mathbf v)</math>
for any vectors {{math|'''u''', '''v'''}} in {{math|''V''}} and scalar {{math|''a''}} in {{mvar|F}}.

An equivalent condition is that for any vectors {{math|'''u''', '''v'''}} in {{math|''V''}} and scalars {{math|''a'', ''b''}} in {{mvar|F}}, one has
: <math>T(a \mathbf u + b \mathbf v) = aT(\mathbf u) + bT(\mathbf v).</math>

When {{math|1=''V'' = ''W''}}, that is, when the domain and the codomain are the same vector space, a linear map {{math|''T'' : ''V'' → ''V''}} is also known as a ''linear operator'' on {{mvar|V}}.

A [[bijective]] linear map between two vector spaces (that is, a linear map under which every vector of the second space is the image of exactly one vector of the first) is an [[isomorphism]]. Because an isomorphism preserves linear structure, two isomorphic vector spaces are "essentially the same" from the linear algebra point of view, in the sense that they cannot be distinguished by using vector space properties. An essential question in linear algebra is testing whether a linear map is an isomorphism or not, and, if it is not an isomorphism, finding its [[Range of a function|range]] (or image) and the set of elements that are mapped to the zero vector, called the [[Kernel (linear operator)|kernel]] of the map. All these questions can be solved by using [[Gaussian elimination]] or some variant of this [[algorithm]].

===Subspaces, span, and basis===
{{main|Linear subspace|Linear span|Basis (linear algebra)}}
The study of those subsets of vector spaces that are themselves vector spaces under the induced operations is fundamental, as it is for many mathematical structures. These subsets are called [[linear subspace]]s. More precisely, a linear subspace of a vector space {{mvar|V}} over a field {{mvar|F}} is a [[subset]] {{mvar|W}} of {{mvar|V}} such that {{math|'''u''' + '''v'''}} and {{math|''a'''''u'''}} are in {{mvar|W}} for every {{math|'''u'''}}, {{math|'''v'''}} in {{mvar|W}} and every {{mvar|a}} in {{mvar|F}}. (These conditions suffice to imply that {{mvar|W}} is a vector space.)

For example, given a linear map {{math|''T'' : ''V'' → ''W''}}, the [[image (function)|image]] {{math|''T''(''V'')}} of {{mvar|V}} and the [[inverse image]] {{math|''T''<sup>−1</sup>('''0''')}} of {{math|'''0'''}} (called the [[kernel (linear algebra)|kernel]] or null space) are linear subspaces of {{mvar|W}} and {{mvar|V}}, respectively.

Another important way of forming a subspace is to consider [[linear combination]]s of a set {{mvar|S}} of vectors: the set of all sums
: <math>a_1 \mathbf v_1 + a_2 \mathbf v_2 + \cdots + a_k \mathbf v_k,</math>
where {{math|'''v'''<sub>1</sub>, '''v'''<sub>2</sub>, ..., '''v'''<sub>''k''</sub>}} are in {{mvar|S}} and {{math|''a''<sub>1</sub>, ''a''<sub>2</sub>, ..., ''a''<sub>''k''</sub>}} are in {{mvar|F}}, forms a linear subspace called the [[Linear span|span]] of {{mvar|S}}. The span of {{mvar|S}} is also the intersection of all linear subspaces containing {{mvar|S}}. In other words, it is the smallest (for the inclusion relation) linear subspace containing {{mvar|S}}.
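The image and kernel of a linear map given by a matrix can be computed concretely. The sketch below (assuming Python with NumPy; it uses the singular value decomposition rather than the Gaussian elimination mentioned above, a common numerically stable alternative) finds the dimensions of the image and kernel of the map {{math|''T''('''v''') = ''A'''''v'''}} from {{math|'''R'''<sup>3</sup>}} to {{math|'''R'''<sup>2</sup>}}; the image of {{mvar|T}} is the span of the columns of {{mvar|A}}.

<syntaxhighlight lang="python">
import numpy as np

# T(v) = A @ v maps R^3 to R^2. Its image is the span of the columns
# of A, so dim(image) is the rank of A. The kernel is spanned by the
# right singular vectors whose singular values are (numerically) zero.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rows are proportional, so rank 1

rank = np.linalg.matrix_rank(A)
print("dim image  =", rank)       # 1

_, s, Vt = np.linalg.svd(A)
kernel_basis = Vt[rank:]          # (3 - rank) vectors spanning the kernel
print("dim kernel =", kernel_basis.shape[0])   # 2
for k in kernel_basis:
    assert np.allclose(A @ k, 0)  # each basis vector is mapped to zero
</syntaxhighlight>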
A set of vectors is [[linearly independent]] if none is in the span of the others. Equivalently, a set {{mvar|S}} of vectors is linearly independent if the only way to express the zero vector as a linear combination of elements of {{mvar|S}} is to take zero for every coefficient {{mvar|a<sub>i</sub>}}.

A set of vectors that spans a vector space is called a [[spanning set]] or [[generating set]]. If a spanning set {{mvar|S}} is ''linearly dependent'' (that is, not linearly independent), then some element {{math|'''w'''}} of {{mvar|S}} is in the span of the other elements of {{mvar|S}}, and the span would remain the same if one were to remove {{math|'''w'''}} from {{mvar|S}}. One may continue to remove elements of {{mvar|S}} until getting a ''linearly independent spanning set''. Such a linearly independent set that spans a vector space {{mvar|V}} is called a [[Basis (linear algebra)|basis]] of {{mvar|V}}. The importance of bases lies in the fact that they are simultaneously minimal spanning sets and maximal independent sets. More precisely, if {{mvar|S}} is a linearly independent set, and {{mvar|T}} is a spanning set such that {{math|''S'' ⊆ ''T''}}, then there is a basis {{mvar|B}} such that {{math|''S'' ⊆ ''B'' ⊆ ''T''}}.

Any two bases of a vector space {{mvar|V}} have the same [[cardinality]], which is called the [[Dimension (vector space)|dimension]] of {{mvar|V}}; this is the [[dimension theorem for vector spaces]]. Moreover, two vector spaces over the same field {{mvar|F}} are [[isomorphic]] if and only if they have the same dimension.<ref>{{Harvp|Axler|2015}} p. 82, §3.59</ref>

If any basis of {{mvar|V}} (and therefore every basis) has a finite number of elements, {{mvar|V}} is a ''finite-dimensional vector space''. If {{mvar|U}} is a subspace of {{mvar|V}}, then {{math|dim ''U'' ≤ dim ''V''}}. In the case where {{mvar|V}} is finite-dimensional, the equality of the dimensions implies {{math|1=''U'' = ''V''}}.

If {{math|''U''<sub>1</sub>}} and {{math|''U''<sub>2</sub>}} are subspaces of {{mvar|V}}, then
:<math>\dim(U_1 + U_2) = \dim U_1 + \dim U_2 - \dim(U_1 \cap U_2),</math>
where {{math|''U''<sub>1</sub> + ''U''<sub>2</sub>}} denotes the span of {{math|''U''<sub>1</sub> ∪ ''U''<sub>2</sub>}}.<ref>{{Harvp|Axler|2015}} p. 23, §1.45</ref>
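The rank of a matrix gives a practical test for linear independence and dimension: finitely many vectors are linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors. The following sketch (assuming Python with NumPy; the particular subspaces of {{math|'''R'''<sup>4</sup>}} are chosen only for illustration) checks this and verifies the dimension formula above on a small example.

<syntaxhighlight lang="python">
import numpy as np

# Vectors are linearly independent exactly when the matrix with those
# vectors as columns has rank equal to the number of vectors.
S = np.column_stack([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [1, 1, 0, 0]])            # third = first + second
print(np.linalg.matrix_rank(S) == S.shape[1])  # False: linearly dependent

# Dimension formula for two subspaces of R^4 given by bases:
# U1 = span{e1, e2}, U2 = span{e2, e3}, so U1 ∩ U2 = span{e2}.
U1 = np.column_stack([[1, 0, 0, 0], [0, 1, 0, 0]])
U2 = np.column_stack([[0, 1, 0, 0], [0, 0, 1, 0]])

dim_U1 = np.linalg.matrix_rank(U1)                    # 2
dim_U2 = np.linalg.matrix_rank(U2)                    # 2
dim_sum = np.linalg.matrix_rank(np.hstack([U1, U2]))  # dim(U1 + U2) = 3
# dim(U1 ∩ U2) = 1 here, and indeed 3 == 2 + 2 - 1.
assert dim_sum == dim_U1 + dim_U2 - 1
</syntaxhighlight>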