==Extensions and generalizations==
This section presents several related topics that do not generally appear in elementary textbooks on linear algebra, but are commonly considered, in advanced mathematics, as parts of linear algebra.

===Module theory===
{{main|Module (mathematics)}}
The existence of multiplicative inverses in fields is not involved in the axioms defining a vector space. One may thus replace the field of scalars by a [[ring (mathematics)|ring]] {{mvar|R}}, and this gives the structure called a '''module''' over {{mvar|R}}, or {{mvar|R}}-module.

The concepts of linear independence, span, basis, and linear maps (also called [[module homomorphism]]s) are defined for modules exactly as for vector spaces, with the essential difference that, if {{mvar|R}} is not a field, there are modules that do not have any basis (for example, {{math|'''Z'''/2'''Z'''}}, viewed as a module over the integers, has no basis, since every element is annihilated by 2). The modules that have a basis are the [[free module]]s, and those that are spanned by a finite set are the [[finitely generated module]]s. Module homomorphisms between finitely generated free modules may be represented by matrices. The theory of matrices over a ring is similar to that of matrices over a field, except that [[determinant]]s exist only if the ring is [[commutative ring|commutative]], and that a square matrix over a commutative ring is [[invertible matrix|invertible]] if and only if its determinant has a [[multiplicative inverse]] in the ring.

Vector spaces are completely characterized by their dimension (up to an isomorphism). In general, there is no such complete classification for modules, even if one restricts oneself to finitely generated modules. However, every module is a [[cokernel]] of a homomorphism of free modules.

Modules over the integers can be identified with [[abelian group]]s, since multiplication by an integer may be identified with repeated addition. Most of the theory of abelian groups may be extended to modules over a [[principal ideal domain]]. In particular, over a principal ideal domain, every submodule of a free module is free, and the [[fundamental theorem of finitely generated abelian groups]] may be extended straightforwardly to finitely generated modules over a principal ideal domain.

There are many rings for which there are algorithms for solving linear equations and systems of linear equations. However, these algorithms generally have a [[computational complexity]] that is much higher than that of similar algorithms over a field. For more details, see [[Linear equation over a ring]].

===Multilinear algebra and tensors===
{{cleanup|section|reason=The dual space is considered above, and the section must be rewritten to give an understandable summary of this subject|date=September 2018}}
In [[multilinear algebra]], one considers multivariable linear transformations, that is, mappings that are linear in each of several different variables. This line of inquiry naturally leads to the idea of the [[dual space]], the vector space {{math|''V*''}} consisting of linear maps {{math|''f'' : ''V'' → ''F''}}, where {{mvar|F}} is the field of scalars. Multilinear maps {{math|''T'' : ''V<sup>n</sup>'' → ''F''}} can be described via [[tensor product]]s of elements of {{math|''V*''}}, as in the example below. If, in addition to vector addition and scalar multiplication, there is a bilinear vector product {{math|''V'' × ''V'' → ''V''}}, the vector space is called an [[Algebra over a field|algebra]]; for instance, associative algebras are algebras with an associative vector product (like the algebra of square matrices, or the algebra of polynomials).
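As a concrete two-dimensional illustration (the notation {{math|''e''<sub>1</sub>*, ''e''<sub>2</sub>*}} for the dual basis of {{math|(''F''<sup>2</sup>)*}} is chosen here only for this example), a bilinear map {{math|''T'' : ''F''<sup>2</sup> × ''F''<sup>2</sup> → ''F''}} is determined by the four scalars {{math|1=''a''<sub>''ij''</sub> = ''T''(''e''<sub>''i''</sub>, ''e''<sub>''j''</sub>)}} and can be written as a sum of tensor products of dual vectors:
<math display="block">T = \sum_{i,j=1}^{2} a_{ij}\, e_i^* \otimes e_j^*, \qquad T(u, v) = \sum_{i,j=1}^{2} a_{ij}\, u_i v_j,</math>
where {{math|''u''<sub>''i''</sub>}} and {{math|''v''<sub>''j''</sub>}} are the coordinates of {{mvar|u}} and {{mvar|v}} in the standard basis {{math|''e''<sub>1</sub>, ''e''<sub>2</sub>}}.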
===Topological vector spaces===
{{main|Topological vector space|Normed vector space|Hilbert space}}
Vector spaces that are not finite-dimensional often require additional structure to be tractable. A [[normed vector space]] is a vector space along with a function called a [[Norm (mathematics)|norm]], which measures the "size" of elements. The norm induces a [[Metric (mathematics)|metric]], which measures the distance between elements, and induces a [[Topological space|topology]], which allows for a definition of continuous maps. The metric also allows for a definition of [[Limit (mathematics)|limits]] and [[Complete metric space|completeness]]; a normed vector space that is complete is known as a [[Banach space]]. A vector space equipped with an [[Inner product space|inner product]] (a positive-definite, conjugate-symmetric [[sesquilinear form]]) that is complete with respect to the induced norm is known as a [[Hilbert space]], which is in some sense a particularly well-behaved Banach space.

[[Functional analysis]] applies the methods of linear algebra alongside those of [[mathematical analysis]] to study various function spaces; the central objects of study in functional analysis are [[Lp space|{{mvar|L<sup>p</sup>}} space]]s, which are Banach spaces, and especially the {{math|''L''<sup>2</sup>}} space of square-integrable functions, which is the only Hilbert space among them. Functional analysis is of particular importance to quantum mechanics, the theory of partial differential equations, digital signal processing, and electrical engineering. It also provides the foundation and theoretical framework that underlies the Fourier transform and related methods.
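A standard illustration (the interval {{math|[0, 1]}} is chosen here only for concreteness) is the space {{math|''L''<sup>2</sup>([0, 1])}} of square-integrable functions, where the inner product and the norm it induces are
<math display="block">\langle f, g\rangle = \int_0^1 f(x)\,\overline{g(x)}\,dx, \qquad \|f\|_2 = \sqrt{\langle f, f\rangle} = \left(\int_0^1 |f(x)|^2\,dx\right)^{1/2}.</math>
Completeness with respect to this norm is what makes {{math|''L''<sup>2</sup>([0, 1])}} a Hilbert space; for {{math|''p'' ≠ 2}}, the {{mvar|L<sup>p</sup>}} norm is not induced by any inner product, which is why {{math|''L''<sup>2</sup>}} is the only Hilbert space among the {{mvar|L<sup>p</sup>}} spaces.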